Article

Classification of Maize Growth Stages Based on Phenotypic Traits and UAV Remote Sensing

1 College of Information and Management Science, Henan Agricultural University, Zhengzhou 450002, China
2 Key Lab of Smart Agriculture System, Ministry of Education, China Agricultural University, Beijing 100083, China
3 Key Laboratory of Quantitative Remote Sensing in Agriculture, Ministry of Agriculture and Rural Affairs, Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
* Author to whom correspondence should be addressed.
Agriculture 2024, 14(7), 1175; https://doi.org/10.3390/agriculture14071175
Submission received: 27 May 2024 / Revised: 1 July 2024 / Accepted: 16 July 2024 / Published: 18 July 2024
(This article belongs to the Special Issue Precision Remote Sensing and Information Detection in Agriculture)

Abstract
Maize, an important cereal crop and crucial industrial material, is widely used in various fields, including food, feed, and industry. Maize is also a highly adaptable crop, capable of thriving under various climatic and soil conditions. Against the backdrop of intensified climate change, studying the classification of maize growth stages can aid in adjusting planting strategies to enhance yield and quality. Accurate classification of the growth stages of maize breeding materials is important for enhancing yield and quality in breeding endeavors. Traditional remote sensing-based crop growth stage classifications mainly rely on time series vegetation index (VI) analyses; however, VIs are prone to saturation under high-coverage conditions. Maize phenotypic traits at different growth stages may improve the accuracy of crop growth stage classifications. Therefore, we developed a method for classifying maize growth stages during the vegetative growth phase by combining maize phenotypic traits with different classification algorithms. First, we tested various VIs, texture features (TFs), and combinations of VI and TF as input features to estimate the leaf chlorophyll content (LCC), leaf area index (LAI), and fractional vegetation cover (FVC). We determined the optimal feature inputs and estimation methods and completed crop height (CH) extraction. Then, we tested different combinations of maize phenotypic traits as input variables to determine their accuracy in classifying growth stages and to identify the optimal combination and classification method. Finally, we compared the proposed method with traditional growth stage classification methods based on remote sensing VIs and machine learning models. 
The results indicate that (1) when the VI+TFs are used as input features, random forest regression (RFR) shows a good estimation performance for the LCC (R2: 0.920, RMSE: 3.655 SPAD units, MAE: 2.698 SPAD units), Gaussian process regression (GPR) performs well for the LAI (R2: 0.621, RMSE: 0.494, MAE: 0.397), and linear regression (LR) exhibits a good estimation performance for the FVC (R2: 0.777, RMSE: 0.051, MAE: 0.040); (2) when using the maize LCC, LAI, FVC, and CH phenotypic traits to classify maize growth stages, the random forest (RF) classification method achieved the highest accuracy (accuracy: 0.951, precision: 0.951, recall: 0.951, F1: 0.951); and (3) the effectiveness of the growth stage classification based on maize phenotypic traits outperforms that of traditional remote sensing-based crop growth stage classifications.

1. Introduction

Maize, a staple food crop and vital industrial material, is extensively used in diverse sectors, such as food, feed, and raw materials [1]. Phenology, the study of recurring natural events that are influenced by environmental factors and human activities, encompasses plant germination, leaf unfolding, flowering, leaf discoloration, and leaf fall. In agricultural production management, understanding crop growth stages is pivotal for crop breeding management and yield prediction [2]. Therefore, an accurate classification of crop growth stages has significant implications for making informed agricultural decisions and ensuring food production safety [3].
Conventional methods of gathering crop phenology information involve field surveys that are conducted by technical experts. However, these manual observations are limited to small-scale, time-consuming assessments [4]. In contrast, remote sensing technology has provided new possibilities. Significant strides have been made in using remote sensing data for classifying crop growth stages. Traditional remote sensing-based crop growth stage classifications primarily rely on time series analyses of the vegetation index (VI). However, time series VI curves are susceptible to data noise, necessitating the use of algorithms, such as Fourier filtering [5] and asymmetric Gaussian functions [6], to mitigate the noise.
Moreover, the requirement that time series VI data cover the entire growth period leads to significant lags [7]. Additionally, VIs tend to saturate under high-coverage conditions in crops [8]. Furthermore, there are often systematic deviations between the crop growth stages that are extracted based on curve feature points and actual agronomic growth stages [9] that pose numerous challenges in practical applications.
In recent years, unmanned aerial vehicles (UAVs) have rapidly been incorporated in agricultural monitoring. With their high flexibility and lower costs, UAVs have greatly facilitated agricultural field data collection efforts [10,11]. The sensors mounted on UAVs actively acquire timely and highly spatially resolved high-throughput phenotypic information on crops, and this methodology is now widely applied in crop phenotypic trait monitoring (such as the leaf area index (LAI), leaf chlorophyll content (LCC) [12], crop height (CH), and biomass [13,14,15]). In addition, some scholars have conducted studies using UAV images to identify the maize tasselling stage [4]. The crop LCC represents the turnover of leaf biochemical components [16]. The LAI is a widely used indicator of crop growth status [17]. The fractional vegetation cover (FVC) describes the spatial distribution of crop growth and can be used to effectively characterize the growth conditions of maize [18,19]. CH is crucial for describing plant growth status during the vegetative development of maize [20].
Use of phenotypic traits such as the LCC, LAI, FVC, and CH for classifying maize growth stages may address the impact of VI saturation on growth stage classifications. The following tasks were conducted in this study: (1) The phenotypic traits (e.g., LCC, LAI, FVC) during the maize growth and development stages were estimated. (2) The vegetative growth stages based on maize phenotypic traits (such as LCC, LAI, FVC, and CH) were classified. The first task involves estimating crop phenotypic traits with high precision. A common approach is to determine the relationships among various VIs, such as the normalized difference vegetation index (NDVI), and crop phenotypic traits [21]. However, considering the significant differences in crop canopy structures during different growth stages, the changes in VI may not effectively reflect these differences [22]. Yue et al. [23] reported that using image texture features (TFs) combined with VIs improves the accuracy of winter wheat ground biomass estimations. Campos-Taberner et al. [24] concluded that TFs derived from high-resolution remote sensing images may be more effective than spectral features in estimating rice LAI. These studies provide new perspectives for estimating crop phenotypic traits. Considering the similarities among winter wheat, rice, and maize, this study attempted to use image TFs to estimate maize phenotypic traits. The second task involves classifying the growth stages based on maize phenotypic traits during the vegetative growth phase. A single phenotypic trait alone cannot accurately characterize maize growth stages. Combining multiple crop phenotypic traits can help improve the classification of maize growth stages. This study used different crop phenotypic trait combinations as inputs and combined them with different machine learning models to determine the maize growth stages.
In this study, we aimed to devise a method for classifying maize growth stages during the vegetative growth phase by utilizing maize phenotypic traits in conjunction with various classification algorithms. To achieve this, we conducted the following tasks:
(1) We used UAVs to collect maize canopy orthoimage data from seven phases (i.e., from the emergence to tasselling stages) and conducted ground measurements with LCC, LAI, and CH data. We then converted the LAI to the FVC.
(2) We tested the combined use of VI, TF, and VI+TF to assess the accuracy of estimating the LCC, LAI, and FVC. This determination helped us identify the most accurate method for estimating maize phenotypic traits and facilitated CH extraction.
(3) We tested the growth stage classification accuracy by using different combinations of maize phenotypic traits, which led to the identification of the optimal combination of phenotypic traits and classification methods.
(4) We compared this method with traditional classification methods based on vegetation index spectral information.

2. Materials and Methods

2.1. Study Area

The study area is located in Xingyang City, Zhengzhou City, Henan Province, China (Figure 1, N: 34°36′–34°59′, E: 113°7′–113°30′). Xingyang City is located at the junction of the middle and lower reaches of the Yellow River in central Henan Province and has a warm temperate continental monsoon climate. The annual average temperature is approximately 14.8 °C, and the annual average precipitation is approximately 608.8 mm. A total of 160 maize planting plots were arranged in 10 rows and 16 columns. Each planting plot had dimensions of 2.5 m × 5 m, with four rows of maize planted with 9 to 10 plants per row. We collected data for the seven stages of maize growth during the summer of 2023, from emergence to tasselling. These stages corresponded to P1 (30 June), P2 (5 July), P3 (8 July), P4 (16 July), P5 (22 July), P6 (27 July), and P7 (10 August).

2.2. UAV Multispectral Images and Collection of Maize Phenotypic Traits

2.2.1. Measurements of Maize Phenotypic Traits

The LCC values were measured using a portable SPAD-502 sensor (Soil and Plant Analyzer Development, Tokyo, Japan). The first and second fully expanded leaves from the top of each maize plant were selected, and both the tip and middle sections of non-vein areas were measured. These measurements were repeated three times at the center of each maize plot, and the mean values were recorded as the final results. The LAI was obtained using an LAI-2200C Plant Canopy Analyzer (LI-COR Biosciences, Lincoln, NE, USA). Prior to the measurements, the light intensities were measured in an open backlight area. Subsequently, the LAI measurements were obtained both parallel and perpendicular to the maize rows. The maize phenotypic trait measurements were conducted on only one planting plot from each pair of adjacent plots. The CH was measured by randomly selecting three maize plants per plot and using their average height as the plant height of that plot. The number of maize plots measured per stage was 80, and the field measurement results are presented in Table 1.
The conversion of LAI to FVC uses Formula (1) as follows [25]:
$FVC = 1 - e^{-\frac{G \times \Omega \times LAI}{\cos\theta}}$
where $G$, $\theta$, and $\Omega$ represent the leaf projection factor in the spherical direction, solar zenith angle, and clumping index ($G$ = 0.5, $\theta$ = 0, $\Omega$ = 1), respectively.
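As a minimal sketch, the LAI-to-FVC conversion of Formula (1) can be written as a small function (the function name `lai_to_fvc` is ours, not from the paper); with the stated parameter values (G = 0.5, Ω = 1, θ = 0) it reduces to FVC = 1 − e^(−0.5·LAI):

```python
import numpy as np

def lai_to_fvc(lai, G=0.5, omega=1.0, theta=0.0):
    """Gap-fraction conversion of Formula (1):
    FVC = 1 - exp(-G * Omega * LAI / cos(theta)); theta in radians."""
    return 1.0 - np.exp(-G * omega * np.asarray(lai, dtype=float) / np.cos(theta))

# With the defaults, LAI = 2 gives FVC = 1 - e^{-1} (about 0.63).
fvc = lai_to_fvc([0.5, 1.0, 2.0, 4.0])
```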
Due to the small size of maize seedlings in the early stages, LCC was not measured for P1 and P2, and LAI was not measured for P1 to P3. Based on the available data, we had a total of 800 sets of LCC data from P3 to P7, 640 sets of LAI data from P4 to P7, and 640 sets of FVC data from P4 to P7 for regression estimation.

2.2.2. UAV Flight and Average Spectral Extraction of Maize

We used a DJI Phantom 4 multispectral UAV (DJI Technology Co., Ltd., Shenzhen, China) as the remote sensing platform. The UAV has a total weight of 1487 g and has six camera sensors. The sensor size is 1/2.9” and consists of one RGB sensor for visible light imaging and five monochrome sensors for multispectral imaging. Each sensor has an effective pixel resolution of 2.08 MP, and the monochrome sensors include the R, G, B, red edge (RE), and near-infrared (NIR) bands. Considering the UAV endurance capabilities and safety concerns, the DJI Phantom 4 multispectral UAV flight altitude was set to 20 m. UAV remote sensing data were acquired during periods with stable solar radiation and clear weather conditions, which typically occurred between 10:00 and 14:00. The overlap settings were 80% in the flight direction and 75% in the lateral direction. The UAV autonomously followed predefined flight paths during data collection to capture the image data. After UAV data collection, the captured images were imported into DJI Terra software V4.0.1 version (DJI, Shenzhen, China), where automated image stitching was performed based on the UAV and camera parameters.
The multispectral UAV imagery was utilized to construct regions of interest (ROIs) based on the planting plots. Subsequently, ENVI 5.3 software (Exelis Visual Information Solutions, Boulder, CO, USA) was used to extract spectral information from the ROIs of each planting plot in the UAV images. A digital surface model (DSM) and five commonly used VIs were extracted, namely, the green normalized difference vegetation index (GNDVI), leaf chlorophyll index (LCI), normalized difference red edge (NDRE), normalized difference vegetation index (NDVI), and optimized soil-adjusted vegetation index (OSAVI).
We extracted the spectral information from the multispectral UAV images collected during the summer of 2023, with 160 sets of spectral information extracted for each growth stage based on the ROIs, totaling 1120 sets. These were categorized into six classes: emergence stage (ES), three-leaf stage (TLS), jointing stage (JS), small trumpet stage (STS), big trumpet stage (BTS), and tasselling stage (TS). Considering the characteristics of the data and the sample size, the data were subsequently divided into training and testing sets at a ratio of 6:4. Sample photos of each growth stage are shown in Figure 2.
It is important to note that our data collection periods are fixed as P1–P7, which do not correspond exactly to the growth stages of maize. This is because the growth stages can vary even among different maize plots within the same period. For each maize plot during P1–P7, we determine the growth stages with the assistance of agronomy experts to ensure that the data comprehensively cover the entire growth period from ES to TS.

3. Methods

3.1. Methodological Framework

For this study, we primarily compare and analyze maize growth stage classifications that are based on remote sensing VIs and those that are based on maize phenotypic traits. The research is described in the following three aspects, as depicted in the technical roadmap in Figure 3:
(1) Spectral information is extracted from UAV remote sensing images, and the extracted VI information is utilized for classifying maize growth stages.
(2) Based on the experimentally measured LCC and LAI and converted FVC data, we estimated the LCC, LAI, and FVC values of maize during its vegetative growth stages and extracted the crop heights (CHs).
(3) Different combinations of LCC, LAI, FVC, and CH phenotypic traits were tested. The effectiveness of the various combinations in classifying the maize growth stages was evaluated, and the optimal strategy for mapping maize growth stages was selected.
Figure 3. Methodological framework.

3.2. Vegetation Indices and Texture Features

3.2.1. Vegetation Indices

GNDVI’s sensitivity to green light reflection makes it particularly effective in detecting vegetation health. Its estimation of the LCC is valuable due to its direct correlation with chlorophyll content. LCI directly utilizes the ratio of red and near-infrared light, rendering it highly sensitive to changes in chlorophyll content. NDRE is more sensitive to moderate to high concentrations of chlorophyll, suitable for monitoring the LCC and LAI in later stages of crop growth, especially during leaf densification. NDVI, one of the most commonly used vegetation indices, is widely applied for monitoring vegetation growth and coverage. NDVI is particularly effective in estimating the LAI as it reflects vegetation biomass and density well. OSAVI exhibits higher accuracy in regions heavily influenced by soil background, providing more stable estimates of the LAI and LCC by minimizing the impact of soil reflectance through optimized algorithms.
The VI primarily reflects the differences in vegetation reflectance between the visible and near-infrared spectral bands. Based on the advantages of these five VIs in estimating maize traits, this study selected the GNDVI, LCI, NDRE, NDVI, and OSAVI for estimating the LCC, LAI, and FVC values. The specific calculation formulas are shown in Table 2.
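As an illustrative sketch, four of these indices can be computed per pixel from the multispectral bands; the formulas below follow their common definitions (the paper's exact formulas are in Table 2, and the LCI is omitted here because its formulation varies across sources). The function name `vegetation_indices` is ours:

```python
import numpy as np

def vegetation_indices(R, G, RE, NIR):
    """Per-pixel VIs from reflectance bands (common definitions).
    All inputs are arrays of identical shape."""
    R, G, RE, NIR = (np.asarray(b, dtype=float) for b in (R, G, RE, NIR))
    return {
        "NDVI":  (NIR - R) / (NIR + R),
        "GNDVI": (NIR - G) / (NIR + G),
        "NDRE":  (NIR - RE) / (NIR + RE),
        "OSAVI": (NIR - R) / (NIR + R + 0.16),  # 0.16 = soil-adjustment constant
    }
```

Plot-level values are then obtained by averaging each index over the pixels inside a plot's ROI.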

3.2.2. Texture Features

Image TFs reflect spatial positional characteristics by capturing changes in grayscale values within pixel neighborhoods. TFs do not exhibit saturation effects and represent microscopic structural characteristics, unlike VIs. TFs can characterize the textures, structures, and spatial distributions of surfaces. In this study, eight TFs were extracted using the grey-level co-occurrence matrix (GLCM) method: mean (Mea), variance (Var), homogeneity (Hom), contrast (Con), dissimilarity (Dis), entropy (Ent), angular second moment (ASM), and correlation (Cor). Considering the maize planting environment and the UAV image pixel size, a 3 × 3 window size was utilized for TF extraction. This window size balances spatial resolution and computational complexity effectively [10]. The calculation formulas are shown in Table 3.

3.3. Extraction of Maize Crop Heights

Maize plant height is a significant indicator of the vegetative growth process of maize. A digital elevation model (DEM) delineates the spatial distributions of regional topographic features that were obtained using data collection methods such as contour lines or similar solid models, followed by interpolation of the data. The DSM refers to a ground elevation model that encompasses features such as surface structures, bridges, and tree heights. Although DEMs solely capture elevation information related to terrain, they do not include other surface details. Conversely, DSMs extend beyond DEMs by incorporating additional surface features, including information on crop heights, in addition to ground-level data. A crop surface model (CSM), which denotes maize CHs in this study, can be derived using Formula (2) [23,31,32].
CSM = DSM − DEM
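In practice, Formula (2) amounts to a per-pixel raster subtraction followed by plot-level averaging; a minimal sketch (the array names and the plot-mask input are our assumptions, not from the paper):

```python
import numpy as np

def crop_height(dsm, dem, plot_mask):
    """Formula (2): CSM = DSM - DEM; the mean CSM over a plot's pixels
    serves as that plot's crop height. All arrays share one grid/shape."""
    csm = np.asarray(dsm, dtype=float) - np.asarray(dem, dtype=float)
    return float(csm[plot_mask].mean())
```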

3.4. Machine Learning Techniques

3.4.1. Machine Learning-Based Regression Techniques

CatBoost is a gradient-boosting algorithm specifically designed to handle categorical features. Its unique feature is that it directly handles raw categorical features without requiring preprocessing steps such as one-hot encoding. CatBoost generates models through symmetric tree structures and gradient boosting and demonstrates excellent performance on large-scale and high-cardinality categorical datasets [33].
Gaussian process regression (GPR) is a nonparametric regression method based on Bayesian inference. It models the distribution of the output variable using Gaussian processes. It can estimate uncertainty, making it suitable for small-sample datasets and situations where accurate estimations of the classification uncertainty are required [34].
Linear regression (LR) is a fundamental method that establishes linear relationships between input features and output variables. The model fits the data by minimizing the sum of squared residuals, making it simple and intuitive but limited in its ability to model complex nonlinear relationships [35].
Ridge regression (RR) is a regularization method of linear regression that constrains model parameters by adding an L2 regularization term. It mainly addresses multicollinearity (high correlations between features) issues. Regularization helps prevent overfitting [36].
Random forest regression (RFR) is an ensemble learning method that reduces the risk of overfitting by constructing multiple decision trees and integrating their prediction results for regression tasks [37].
Support vector machine regression (SVR) achieves regression tasks by finding a hyperplane that minimizes the distances between data points and the hyperplane. SVR has strong modelling capabilities for nonlinear relationships and can adapt to different regression tasks by selecting kernel functions [38].
K-nearest neighbors regression (KNNR) is a nonparametric method that estimates new data points by analyzing the nearest output values. It is typically suitable for regression tasks with smaller datasets and less noise [39].
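The regression comparison described above can be sketched with scikit-learn as follows. The feature matrix and target here are synthetic stand-ins for the VI+TF inputs and an LCC-like variable (CatBoost is omitted to keep the sketch to one library); this is an illustration of the workflow, not the paper's implementation:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor

# Synthetic stand-in for a plot-level feature matrix (5 VIs + 8 TFs)
# and an LCC-like target; real inputs come from the UAV imagery.
rng = np.random.default_rng(42)
X = rng.random((300, 13))
y = 30 + 25 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 1.5, 300)

# 6:4 train/test split, mirroring the paper's data partitioning.
Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=0.6, random_state=0)

models = {
    "GPR": GaussianProcessRegressor(normalize_y=True),
    "LR": LinearRegression(),
    "RR": Ridge(alpha=1.0),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
    "SVR": SVR(kernel="rbf", C=10.0),
    "KNNR": KNeighborsRegressor(n_neighbors=5),
}
scores = {name: r2_score(yte, m.fit(Xtr, ytr).predict(Xte))
          for name, m in models.items()}
```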

3.4.2. Machine Learning-Based Classification Techniques

RF is an ensemble learning method that is commonly used for both classification and regression tasks. It is built on the foundation of decision trees; multiple decision trees are constructed, and their results are integrated to enhance the model performance and generalization capability [40].
A support vector machine (SVM) is a supervised learning method that is used for binary and multiclass classifications. SVM produces classifications by finding the optimal hyperplane in the feature space and maximizes the projections of samples from different classes onto this hyperplane. It excels in handling high-dimensional data and nonlinear relationships, exhibiting strong generalizability [38].
A multilayer perceptron (MLP) is a type of deep neural network with multiple hidden layers and nonlinear activation functions. An MLP is trained using the backpropagation algorithm and can learn complex relationships that are present in the input data. It performs well on large-scale, high-dimensional data and demonstrates strong adaptability to nonlinear classification tasks [41].
The naive Bayes (NB) classifier is based on Bayes' theorem and models prior and conditional probabilities for classification. It assumes independence among features and is suitable for handling high-dimensional data. It is characterized by its simplicity and efficiency and is particularly suitable for small-sample datasets [42].
Stacking is an ensemble learning method aimed at improving overall model performance by combining the classification results of multiple base models. Compared to traditional single models, stacking exhibits a stronger generalization capability and is suitable for various complex data distributions and model fitting problems [43].
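The classifiers above can be compared in a few lines with scikit-learn; the sketch below uses synthetic stand-ins for the four phenotypic traits and the six stage labels (ES–TS), and the particular trait-vs-stage relationships are illustrative assumptions, not the paper's data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic stand-in: 4 traits (LCC, LAI, FVC, CH) per plot,
# labels 0-5 for the six growth stages ES..TS.
rng = np.random.default_rng(7)
stage = rng.integers(0, 6, 600)
X = np.column_stack([
    30 + 4 * stage + rng.normal(0, 2, 600),                   # LCC-like (SPAD)
    0.8 * stage + rng.normal(0, 0.3, 600),                    # LAI-like
    np.clip(0.15 * stage + rng.normal(0, 0.05, 600), 0, 1),   # FVC-like
    0.4 * stage + rng.normal(0, 0.15, 600),                   # CH-like (m)
])
Xtr, Xte, ytr, yte = train_test_split(X, stage, train_size=0.6,
                                      random_state=0, stratify=stage)

base = [("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("nb", GaussianNB())]
models = dict(base)
models["mlp"] = MLPClassifier(max_iter=2000, random_state=0)
models["stacking"] = StackingClassifier(
    estimators=base, final_estimator=LogisticRegression(max_iter=1000))

acc = {name: accuracy_score(yte, m.fit(Xtr, ytr).predict(Xte))
       for name, m in models.items()}
```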

3.5. Accuracy Evaluation

3.5.1. Evaluation of the Accuracy of Regressions for Maize Phenotypic Traits

The coefficient of determination (R2) is a standard that is used to evaluate the fit of a regression model. It represents the proportion of the variation in the dependent variable that is explained by the model. Its calculation formula is shown in Formula (3).
The root mean squared error (RMSE) is a widely used metric for evaluating the performance of regression models. It measures the average magnitude of the differences between the estimated values and actual observed values. Its formula is shown in Formula (4).
The mean absolute error (MAE) is another metric that is used to assess the performance of regression models. It represents the average absolute difference between the estimated values and actual observed values. Its formula is shown in Formula (5).
$R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}$
$RMSE = \sqrt{\frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{n}}$
$MAE = \frac{1}{n} \sum_{i=1}^{n} |\hat{y}_i - y_i|$
Here, $y_i$ represents the actual measured sample value, $\hat{y}_i$ represents the estimated sample value, and $n$ represents the number of samples. For the same sample data, if the model yields a higher coefficient of determination and a lower root mean squared error, it is generally considered to have greater accuracy.
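Formulas (3)–(5) translate directly into a few lines of NumPy; the function name `regression_metrics` is ours:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """R2, RMSE, and MAE exactly as in Formulas (3)-(5)."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    r2 = 1 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    rmse = np.sqrt(np.mean(resid ** 2))
    mae = np.mean(np.abs(resid))
    return r2, rmse, mae
```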

3.5.2. Evaluation of the Accuracy of Maize Growth Stage Classifications

A confusion matrix reflects the accuracy of sample classification and is represented as an n × n matrix. Each column represents a predicted category, with the column total indicating the number of samples predicted as that category; each row represents a true category, with the row total indicating the number of samples actually in that category. Each cell value indicates the number of samples of a given true category that were predicted as the corresponding category.
In classification problems, by comparing the classification predictions with the true results, one can evaluate the experimental effectiveness using the confusion matrix method. Based on the confusion matrix, one can calculate the model accuracy to represent its overall precision. The accuracy is defined as the proportion of correctly estimated samples by the classifier relative to the total number of samples, and its formula is shown in Formula (6). The precision is the ratio of the number of correctly estimated positive samples to the total number of samples estimated to be positive and is used to measure the model’s classification ability, and its formula is shown in Formula (7). The recall measures the ratio of correctly estimated positive samples to the total positive samples, defined as the proportion of samples actually belonging to the positive class that the model correctly predicts as positive, and its formula is shown in Formula (8). The F1 score, composed of the harmonic mean between precision and recall, can comprehensively consider precision and recall, measure the stability of model performance, and reflect the generalization ability of the model; its formula is shown in Formula (9).
$Accuracy = \frac{TP + TN}{TP + FP + FN + TN}$
$Precision = \frac{TP}{TP + FP}$
$Recall = \frac{TP}{TP + FN}$
$F1\;Score = \frac{2 \times Precision \times Recall}{Precision + Recall}$
Here, TP represents true positive samples, TN represents true negative samples, FP represents false positive samples, and FN represents false negative samples.
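These metrics are available directly in scikit-learn; the toy labels below are illustrative, and for multiclass problems such as the six growth stages the per-class precision, recall, and F1 of Formulas (7)–(9) must be aggregated (a weighted average is used here as one common convention; the paper does not state which averaging it uses):

```python
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

# Toy 3-class example standing in for the six stage labels.
y_true = [0, 0, 1, 1, 2, 2, 2, 1]
y_pred = [0, 1, 1, 1, 2, 2, 1, 1]

cm = confusion_matrix(y_true, y_pred)
acc = accuracy_score(y_true, y_pred)
prec = precision_score(y_true, y_pred, average="weighted")
rec = recall_score(y_true, y_pred, average="weighted")
f1 = f1_score(y_true, y_pred, average="weighted")
```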

4. Results

4.1. Maize Growth Stage Classification Based on VIs and Machine Learning Techniques

We used the extracted VI information and employed several widely recognized high-performance classifiers, specifically RF, SVM, MLP, NB, and stacking, to classify the maize growth stages. The classification results are shown in Table 4. All the classification methods achieved good performance, with the MLP showing the best overall classification performance (accuracy: 0.904, precision: 0.907, recall: 0.904, and F1: 0.904). The corresponding confusion matrix for the MLP is shown in Figure 4. This indicates that when VI information is used as a feature input, the MLP classification method can achieve better classification results for the maize growth stages.

4.2. Estimation of Maize Phenotypic Traits

4.2.1. Correlation Analysis Among Maize Phenotypic Traits, VIs, and TFs

Based on the VIs, we conducted a maize correlation analysis of the LCC, LAI, and FVC, as shown in Figure 5. The results indicate that the correlation coefficients between the five VIs (GNDVI, LCI, NDRE, NDVI, and OSAVI) and the LCC were 0.770, 0.660, 0.485, 0.803, and 0.847, respectively. The OSAVI showed the highest correlation with the LCC (0.847), while the NDRE exhibited the lowest correlation (0.485). The correlation coefficients between the five VIs and the LAI were 0.579, 0.576, 0.523, 0.542, and 0.394, respectively, with the GNDVI showing the highest correlation with the LAI (0.579) and the OSAVI demonstrating the lowest correlation (0.394). The correlation coefficients between the five VIs and the FVC were 0.658, 0.601, 0.500, 0.639, and 0.498, respectively, with the GNDVI showing the highest correlation with the FVC (0.658) and the OSAVI exhibiting the lowest correlation (0.498).
Notably, the results of this study indicate that the correlation between NDRE and the LCC is not as strong as that between OSAVI and the LCC. This may be because this study focused on maize growth stages, primarily the vegetative growth stages. Therefore, soil background effects may influence the correlation coefficients between the LCC and NDRE in the early stages, while the OSAVI specifically optimizes the soil background, resulting in a greater correlation.
Based on the correlation analysis of VIs and the LCC, LAI, and FVC, the correlations between OSAVI image texture features and the LCC, as well as the correlations between the GNDVI image texture features and the LAI and FVC, were analyzed. The correlation analysis between TFs and LCC, LAI, and FVC is presented in Figure 6. According to the correlation analysis, LCC had a greater correlation with Mea (0.735); LAI had greater correlations with Mea (0.574), ASM (0.482), and Hom (0.462); and FVC had greater correlations with Mea (0.654), ASM (0.486), Con (−0.475), and Hom (0.464).

4.2.2. Estimation of Maize Phenotypic Traits Based on the VI

We tested the estimation performances of the LCC, LAI, and FVC for different models when the five VIs were used as the feature variable input to the model, and the results for the validation set are shown in Table 5. Based on the model estimation results, when the VI was used as the feature variable input, the GPR regression model exhibited good estimation performance for the LCC (R2: 0.900, RMSE: 4.071 SPAD units, and MAE: 3.192 SPAD units), LAI (R2: 0.621, RMSE: 0.494, and MAE: 0.397), and FVC (R2: 0.730, RMSE: 0.060, and MAE: 0.044).

4.2.3. Estimation of Maize Phenotypic Traits Based on TFs

Similarly, we tested the estimation performances of the LCC, LAI, and FVC for different models when the TFs were used as the feature variable input to the models, and the results for the validation set are shown in Table 6. Based on the model estimation results, when the TFs were used as the feature variable input to the models, the RFR regression model exhibited good estimation performance for the LCC (R2: 0.898, RMSE: 4.113 SPAD units, and MAE: 3.065 SPAD units) and LAI (R2: 0.641, RMSE: 0.482, and MAE: 0.378), while the LR regression model showed good estimation performance for the FVC (R2: 0.575, RMSE: 0.060, and MAE: 0.045).

4.2.4. Estimation of Maize Phenotypic Traits Based on the VI+TFs

Similarly, we tested the estimation performance of the LCC, LAI, and FVC for different models when VI and TF were used as the feature variables input to the model, and the results for the validation set are shown in Table 7. Based on the model estimation results, when VI and TF were jointly used as the feature variables input to the model, the RFR regression model exhibited good estimation performance for the LCC (R2: 0.920, RMSE: 3.655 SPAD units, and MAE: 2.698 SPAD units), the GPR regression model showed good estimation performance for the LAI (R2: 0.621, RMSE: 0.494, and MAE: 0.397), and the LR regression model demonstrated good estimation performance for the FVC (R2: 0.777, RMSE: 0.051, and MAE: 0.040).
By comparing the estimation performances of the LCC, LAI, and FVC for different models (Table 7), this study adopted a comprehensive approach using both the VI and TFs to estimate the maize phenotypic traits. Specifically, the RFR model was chosen for LCC estimation, the GPR model for LAI estimation, and the LR model for FVC estimation. Scatter plots of the three optimal estimation models using the validation set are shown in Figure 7.

4.2.5. Maize Crop Height Estimation Based on CSMs

We extracted the CSM from the DSM and DEM data (CSM = DSM − DEM) and averaged it to obtain the estimated plant height of each maize planting plot. A comparison between the CSM-derived plant heights and the measured plant heights is shown in Figure 8. The results indicate that UAV DSM and DEM data can be used to extract maize plant heights with high accuracy (R2: 0.935, RMSE: 0.332 m, and MAE: 0.295 m).
It should be noted that the ground measurements averaged the heights of three randomly selected maize plants per plot to obtain the plot-level plant height. During the early vegetative growth stage, maize plants are generally short and sparsely distributed, so the UAV orthoimages of the canopy at this stage may yield only small differences between the DSM and DEM. Given the good fit between the estimated and measured plant heights (R2: 0.935), which met the requirements of this study, no further corrections were applied.
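The per-plot height extraction can be sketched as follows. This is a simplified stand-in for the full photogrammetric workflow: the positive-pixel filter and the plain mean over the plot mask are assumptions for illustration, not the paper's exact processing chain:

```python
import numpy as np

def plot_height_from_csm(dsm, dem, plot_mask):
    """Estimate the average plant height of one plot from a crop surface model.

    dsm, dem: 2-D elevation arrays in meters; plot_mask: boolean array
    selecting the pixels of one planting plot. CSM = DSM - DEM gives the
    canopy height above ground; bare-soil/noise pixels (<= 0) are dropped.
    """
    csm = dsm - dem
    heights = csm[plot_mask]
    heights = heights[heights > 0]
    if heights.size == 0:
        return 0.0          # plot not yet emerged (e.g., stage P1)
    return float(heights.mean())

# toy example: flat DEM at 100 m, canopy raised 1.5 m inside the plot
dem = np.full((10, 10), 100.0)
dsm = dem.copy()
mask = np.zeros((10, 10), dtype=bool)
mask[2:8, 2:8] = True
dsm[mask] += 1.5
h = plot_height_from_csm(dsm, dem, mask)  # 1.5 m for this toy plot
```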

4.3. Maize Growth Stage Mapping Based on Phenotypic Traits

4.3.1. Maize Growth Stage Classification Based on Phenotypic Traits

We evaluated the following combinations of phenotypic traits as classifier inputs:
- LCC, LAI, FVC, and CH: encompasses all four traits, providing a comprehensive dataset for accurate classification that captures canopy structure, leaf health, and overall biomass.
- LCC, FVC, and CH: excludes the LAI, simplifying the feature set while maintaining high classification accuracy through the other significant traits.
- LAI, FVC, and CH: excludes the LCC, focusing on structure- and coverage-related traits while remaining effective for classifying growth stages.
- FVC and CH: a minimalistic combination using the fundamental traits of coverage and height to achieve reasonably accurate classification.
- LCC, LAI, and CH: captures leaf health, leaf area, and height, balancing structural and health-related traits.
- LAI and CH: focuses on canopy density and height, effective for certain growth stages.
- LCC and CH: captures chlorophyll content and height, offering insights into plant health and development.
Although the measured phenotypic traits for P1–P3 are incomplete, the VI and TF information for these stages is complete, so we used the regression estimation methods described above to obtain complete maize trait values. Based on the estimated LCC, LAI, FVC, and CH values, we classified the maize growth stages. Specifically, the different phenotypic trait combinations were used as inputs to RF, SVM, MLP, NB, and stacking classifiers. The classification results for the different combinations are shown in Table 8.
These results indicate that the different phenotypic trait combinations yield distinct classification results. The combination of the LCC, LAI, FVC, and CH achieved the best classification performance, so we selected this combination for classifying the growth stages in this study. For this combination, all of the tested classifiers achieved satisfactory performance; among them, the RF classifier performed best overall (accuracy: 0.951, precision: 0.951, recall: 0.951, and F1: 0.951), and the corresponding confusion matrix is shown in Figure 9.
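The trait-combination comparison can be sketched as follows. The synthetic trait values below merely mimic the monotonic stage trends visible in Table 1; all coefficients, noise levels, and the six-stage encoding are illustrative assumptions, not the paper's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 600
stage = rng.integers(0, 6, n)  # six growth stages (emergence ... tasselling)
# hypothetical traits that increase with growth stage, plus noise
traits = {
    "LCC": 45 + 2.5 * stage + rng.normal(0, 1.0, n),                 # SPAD units
    "LAI": 0.5 + 0.6 * stage + rng.normal(0, 0.15, n),
    "FVC": np.clip(0.1 + 0.15 * stage + rng.normal(0, 0.03, n), 0, 1),
    "CH": 0.3 + 0.4 * stage + rng.normal(0, 0.08, n),                # meters
}

def classify(combo):
    """Train an RF classifier on one trait combination; return (acc, P, R, F1)."""
    X = np.column_stack([traits[t] for t in combo])
    X_tr, X_va, y_tr, y_va = train_test_split(
        X, stage, test_size=0.3, random_state=0, stratify=stage)
    pred = RandomForestClassifier(n_estimators=200, random_state=0)\
        .fit(X_tr, y_tr).predict(X_va)
    p, r, f1, _ = precision_recall_fscore_support(
        y_va, pred, average="macro", zero_division=0)
    return accuracy_score(y_va, pred), p, r, f1

acc_all, *_ = classify(["LCC", "LAI", "FVC", "CH"])  # full combination
acc_two, *_ = classify(["FVC", "CH"])                # minimalistic combination
```

The same loop extends to the SVM, MLP, NB, and stacking classifiers by swapping the estimator.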

4.3.2. Maize Growth Stage Mapping

After comparing and analyzing the classification results presented in the previous sections, we selected the optimal combination of phenotypic traits (LCC, LAI, FVC, and CH) and the RF classifier, which achieved the best performance metrics (accuracy: 0.951, precision: 0.951, recall: 0.951, and F1: 0.951), for classifying the growth stages of all the maize planting plots. We then mapped the maize growth stages based on the classification results, as presented in Figure 10.
The mapping results presented above indicate that during stage P1, all maize planting areas were in the emergence stage. During stage P2, all the maize planting areas were in the three-leaf stage. By stage P3, there was noticeable differentiation in the maize growth stages, with over half of the maize planting areas entering the jointing stage, while the remaining areas were still in the three-leaf stage.
Moving to stage P4, most of the maize planting areas entered the small trumpet stage, while a few remained in the jointing stage. In stage P5, the vast majority of the maize planting areas reached the big trumpet stage, with very few plots still in the small trumpet stage or already entering the tasselling stage. By stage P6, most maize planting areas remained in the big trumpet stage, although a few had already entered the tasselling stage. Finally, in stage P7, nearly all maize planting areas were in the tasselling stage, with only three areas remaining in the big trumpet stage.

5. Discussion

5.1. Impact of Different Phenotypic Traits on Maize Growth Stage Classification

Previous studies have shown that the LAI, FVC, and CH can indicate spatial changes in maize growth, particularly during significant shifts in the vegetative growth stages. Mouafik et al. [44] highlighted the pivotal role of the LAI in assessing vegetation vitality, which is crucial for agricultural and environmental studies. de Magalhães et al. [45] evaluated which VIs correlate most strongly with maize LAI and compared two regression methods based on UAV imagery. Liu et al. [46] constructed a new method to improve the accuracy of LAI estimation by quantitatively accounting for the soil contribution, thereby eliminating soil interference in maize LAI estimation. The LAI enables the evaluation of canopy density, photosynthetic activity, plant health, and growth stages across various temporal and spatial scales and contributes significantly to yield prediction and biomass monitoring.
Li et al.'s [47] study indicated that the FVC reflects the growth status of crops during a specific growth period and is closely related to the crop growth period. Guo et al. [48] highlighted that the CH is a crucial agronomic indicator capable of providing insights into growth status, and Ferraz et al. [49] posited that efficient and accurate assessment of plant height is paramount in appraising maize's growth potential. Hence, these traits are suitable components of phenotypic trait combinations for classifying maize growth stages. Zhai et al. [50] showed that the LCC plays a vital role in monitoring crop growth. Because the LCC also changes across the maize vegetative growth stages, this study innovatively incorporated the LCC into the combination of crop phenotypic traits. According to the classification results, including the LCC in the input trait combination noticeably improved the classification accuracy (Table 8), suggesting that the LCC reflects changes in maize phenotypic traits during the vegetative growth stages. In this study, different maize phenotypic trait combinations were compared for classification; the combination of the LCC, LAI, FVC, and CH provides information on different aspects of vegetative growth and thus yielded good growth stage classification performance (Table 8).

5.2. Comparative Analysis with Traditional Crop Growth Stage Classification Methods

Previous studies have focused primarily on extracting crop growth stage information directly from spectral or textural information. For instance, Rosle et al. [51] used the NDVI generated from UAV imagery to monitor changes in rice crops from the day of planting through the eleventh day after planting.
Yang et al. [52] determined the sowing and harvesting dates of maize and soybeans by constructing NDVI curves from time series data and employing a dynamic threshold method. Murguia-Cozar et al. [53] extracted texture and vegetation color indices from Sentinel-2 imagery and classified maize growth stages using machine learning algorithms; the quadratic SVM model emerged as the best classifier for maize crop phenology, with an overall accuracy of 82.3%. Ye et al. [54] proposed a method based on derivative dynamic time warping and established a maize growth stage detection model from Sentinel-2 time series data; the detected corn phenology had an overall RMSE of less than 6 days. Sifuentes-Ibarra et al. [55] established enhanced vegetation index (EVI) and NDVI models from remote sensing data and proposed a method for classifying maize growth stages in large irrigation areas; with the two VIs, the precision in monitoring the phenological phases exceeded 92% at the beginning of the crop season and decreased to 86.6% at the end of the season.
This study demonstrated that the effectiveness of the growth stage classification methods based on maize phenotypic traits surpassed that of traditional methods that rely on remote sensing VI and machine learning models. We attribute this outcome to the limited sensitivity of VI information to changes in maize growth during its vegetative growth stages. For instance, during the mid-to-late stages of maize vegetative growth, when the FVC is relatively high, the spectral indices often exhibit saturation trends, which fail to accurately represent crop growth changes, consequently leading to lower accuracy in classifications based on the VI. In contrast, classifications based on maize phenotypic traits can directly leverage the differences in phenotypic traits at different growth stages and thereby more precisely determine the various maize growth stages. The results of this study suggested that utilizing phenotypic trait combinations (such as the LCC, LAI, FVC, and CH) for growth stage classification may mitigate these issues and achieve greater classification accuracy.

5.3. The Advantages and Disadvantages of the Current Research

This study utilized traits closely related to the growth and development of maize, namely the LCC, LAI, FVC, and CH, directly in the classification of growth stages. By capturing information about maize growth and development from different perspectives, these traits support highly accurate classification results. Traditional satellite remote sensing classification relies primarily on time series VI analysis; however, VIs tend to saturate under high-coverage conditions in the later stages of crop growth, reducing classification accuracy. Moreover, satellite-based classification methods suffer from long revisit cycles and poor flexibility, making timely, high-precision monitoring of crop growth stages challenging. The combination of the LCC, LAI, FVC, and CH comprehensively reflects maize growth information, effectively avoiding these shortcomings of traditional classification methods.
However, some shortcomings remain. Compared with satellite remote sensing, UAVs cover smaller areas per flight, with each battery typically lasting approximately 30 min; multi-hour missions therefore require multiple batteries, making UAVs less suitable for large-scale crop growth stage classification. Alternative methods may also yield better results for certain growth stages; for instance, the tasselling stage could be identified with object detection algorithms that recognize maize tassels in UAV images.
Although we selected the LCC, LAI, FVC, and CH as maize phenotypic traits, other potentially crucial traits have not been considered. Ignoring factors such as soil nutrients and environmental temperature may limit the model's ability to comprehensively capture all variables influencing crop growth stages, which could affect the accuracy and thoroughness of the classification. Future research could benefit from integrating multi-source data, including soil, climate, and management practices, to develop a more comprehensive growth stage classification model; incorporating maize stem diameters may also enhance classification accuracy. In terms of model selection, while the machine learning models we used achieved high classification accuracies, ensemble learning models with stronger generalization capabilities did not yield satisfactory estimation results given the nature of the data, and introducing a wider range of ensemble models is worth exploring in further research. Finally, the CH extraction accuracy needs further improvement: future research could employ high-precision ground control points to enhance plant height extraction, and estimations of the LAI and FVC could benefit from incorporating deep features.

6. Conclusions

In this study, we developed a method for classifying maize growth stages during the vegetative growth phase by combining maize phenotypic traits with different classification algorithms, and we compared its accuracy with that of traditional classification methods that extract growth stage information from VI data. The conclusions of this study are as follows:
Combining VI and TF to estimate maize phenotypic traits yielded better results than using VI or TF alone (Table 7). Utilizing VI+TF as input features, the RFR, GPR, and LR regression models achieved optimal estimation results for the LCC (R2: 0.920, RMSE: 3.655 SPAD units, and MAE: 2.698 SPAD units), LAI (R2: 0.621, RMSE: 0.494, and MAE: 0.397), and FVC (R2: 0.777, RMSE: 0.051, and MAE: 0.040), respectively.
Using the RF classifier and combining maize phenotypic traits (LCC, LAI, FVC, and CH) as input for classifying maize growth stages, we achieved the highest classification accuracy (accuracy: 0.951). This method significantly outperformed VI-based methods for growth stage classification (accuracy: 0.904), enabling more precise classifications of maize vegetative growth stages.
Future research requires more comprehensive data collection, such as incorporating maize stem diameter information. To improve the CH extraction accuracy, future research can attempt to introduce high-precision ground control points. In terms of estimating maize phenotypic traits, future research can attempt to introduce deeper features to improve the accuracy of the estimations. Additionally, future research could expand the study area and conduct controlled experiments in different regions to evaluate the applicability and robustness of this technology under different environmental conditions. This would further enhance its ability to provide decision support for the management of agricultural production.

Author Contributions

Conceptualization, Y.Y. and J.Y.; data curation, Y.Y., J.Y., Y.L., H.Y., H.F., J.S., J.H. and Q.L.; funding acquisition, Y.Y. and J.Y.; methodology, Y.Y. and J.Y.; software, Y.Y.; validation, Y.Y.; writing—original draft, Y.Y. and J.Y.; writing—review and editing, Y.Y., J.Y., Y.L., H.Y., H.F., J.S., J.H. and Q.L. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the National Natural Science Foundation of China (42101362, 32271993, 42371373), the Henan Province Science and Technology Research Project (232102111123, 222103810024), and the Joint Fund of Science and Technology Research Development program (Cultivation project of preponderant discipline) of Henan Province (222301420114).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The raw/processed data required to reproduce the above findings cannot be shared at this time as the data also form part of an ongoing study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Muthuvel, D.; Sivakumar, B.; Mahesha, A. Future global concurrent droughts and their effects on maize yield. Sci. Total Environ. 2023, 855, 158860.
2. Ahmed, S.; Basu, N.; Nicholson, C.E.; Rutter, S.R.; Marshall, J.R.; Perry, J.J.; Dean, J.R. Use of machine learning for monitoring the growth stages of an agricultural crop. Sustain. Food Technol. 2024, 2, 104–125.
3. Wang, H.; Magagi, R.; Goïta, K.; Trudel, M.; McNairn, H.; Powers, J. Crop phenology retrieval via polarimetric SAR decomposition and Random Forest algorithm. Remote Sens. Environ. 2019, 231, 111234.
4. Guo, Y.; Fu, Y.H.; Chen, S.; Bryant, C.R.; Li, X.; Senthilnath, J.; Sun, H.; Wang, S.; Wu, Z.; de Beurs, K. Integrating spectral and textural information for identifying the tasseling date of summer maize using UAV based RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102435.
5. Noshiri, N.; Beck, M.A.; Bidinosti, C.P.; Henry, C.J. A comprehensive review of 3D convolutional neural network-based classification techniques of diseased and defective crops using non-UAV-based hyperspectral images. Smart Agric. Technol. 2023, 5, 100316.
6. Pepe, M.; Pompilio, L.; Ranghetti, L.; Nutini, F.; Boschetti, M. Mapping spatial distribution of crop residues using PRISMA satellite imaging spectroscopy. Eur. J. Remote Sens. 2023, 56, 2122872.
7. Wardlow, B.D.; Egbert, S.L.; Kastens, J.H. Analysis of time-series MODIS 250 m vegetation index data for crop classification in the US Central Great Plains. Remote Sens. Environ. 2007, 108, 290–310.
8. You, X.; Meng, J.; Zhang, M.; Dong, T. Remote sensing based detection of crop phenology for agricultural zones in China using a new threshold method. Remote Sens. 2013, 5, 3190–3211.
9. Helman, D.; Lensky, I.M.; Tessler, N.; Osem, Y. A phenology-based method for monitoring woody and herbaceous vegetation in Mediterranean forests from NDVI time series. Remote Sens. 2015, 7, 12314–12335.
10. Hu, J.; Feng, H.; Wang, Q.; Shen, J.; Wang, J.; Liu, Y.; Feng, H.; Yang, H.; Guo, W.; Qiao, H. Pretrained Deep Learning Networks and Multispectral Imagery Enhance Maize LCC, FVC, and Maturity Estimation. Remote Sens. 2024, 16, 784.
11. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164.
12. Yue, J.; Tian, J.; Philpot, W.; Tian, Q.; Feng, H.; Fu, Y. VNAI-NDVI-space and polar coordinate method for assessing crop leaf chlorophyll content and fractional cover. Comput. Electron. Agric. 2023, 207, 107758.
13. Yue, J.; Yang, H.; Yang, G.; Fu, Y.; Wang, H.; Zhou, C. Estimating vertically growing crop above-ground biomass based on UAV remote sensing. Comput. Electron. Agric. 2023, 205, 107627.
14. Dutta Gupta, S.; Ibaraki, Y.; Pattanayak, A. Development of a digital image analysis method for real-time estimation of chlorophyll content in micropropagated potato plants. Plant Biotechnol. Rep. 2013, 7, 91–97.
15. Shu, M.; Fei, S.; Zhang, B.; Yang, X.; Guo, Y.; Li, B.; Ma, Y. Application of UAV multisensor data and ensemble approach for high-throughput estimation of maize phenotyping traits. Plant Phenomics 2022, 2022, 9802585.
16. Yue, J.; Yang, H.; Feng, H.; Han, S.; Zhou, C.; Fu, Y.; Guo, W.; Ma, X.; Qiao, H.; Yang, G. Hyperspectral-to-image transform and CNN transfer learning enhancing soybean LCC estimation. Comput. Electron. Agric. 2023, 211, 108011.
17. Lykhovyd, P.V.; Ushkarenko, V.O.; Lavrenko, S.O.; Lavrenko, N.M.; Zhuikov, O.H.; Mrynskyi, I.M.; Didenko, N.O. Leaf area index of sweet corn (Zea mays ssp. saccharata L.) crops depending on cultivation technology in the drip-irrigated conditions of the south of Ukraine. Mod. Phytomorphology 2019, 13, 1–5.
18. Mu, X.; Zhao, T.; Ruan, G.; Song, J.; Wang, J.; Yan, G.; Mcvicar, T.R.; Yan, K.; Gao, Z.; Liu, Y. High spatial resolution and high temporal frequency (30-m/15-day) fractional vegetation cover estimation over China using multiple remote sensing datasets: Method development and validation. J. Meteorol. Res. 2021, 35, 128–147.
19. Muñoz, J.M.; Bolaños, M.A.; Palacios, E.; Palacios, L.A.; Salvador, J.M. Estimation of the vegetal cover fraction in corn from information obtained with remote sensing. Tecnol. Y Cienc. Del Agua 2023, 14, 331–364.
20. Gilliot, J.-M.; Michelin, J.; Hadjard, D.; Houot, S. An accurate method for predicting spatial variability of maize yield from UAV-based plant height estimation: A tool for monitoring agronomic field experiments. Precis. Agric. 2021, 22, 897–921.
21. Almond, S.; Dash, J.; Boyd, D.; Curran, P. Evaluation of the performance of vegetation indices to estimate canopy chlorophyll content for change in soil backgrounds and viewing geometries. In Proceedings of the RSPSoc Annual Conference: Measuring Change in the Earth System, Falmouth, UK, 14–16 September 2008.
22. Jay, S.; Baret, F.; Dutartre, D.; Malatesta, G.; Héno, S.; Comar, A.; Weiss, M.; Maupas, F. Exploiting the centimeter resolution of UAV multispectral imagery to improve remote-sensing estimates of canopy structure and biochemistry in sugar beet crops. Remote Sens. Environ. 2019, 231, 110898.
23. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244.
24. Campos-Taberner, M.; García-Haro, F.J.; Busetto, L.; Ranghetti, L.; Martínez, B.; Gilabert, M.A.; Camps-Valls, G.; Camacho, F.; Boschetti, M. A critical comparison of remote sensing Leaf Area Index estimates over rice-cultivated areas: From Sentinel-2 and Landsat-7/8 to MODIS, GEOV1 and EUMETSAT polar system. Remote Sens. 2018, 10, 763.
25. Hu, J.; Yue, J.; Xu, X.; Han, S.; Sun, T.; Liu, Y. UAV-Based Remote Sensing for Soybean FVC, LCC, and Maturity Monitoring. Agriculture 2023, 13, 692.
26. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. NASA Spec. Publ. 1974, 351, 309.
27. Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000.
28. Zurita-Milla, R.; Kaiser, G.; Clevers, J.; Schneider, W.; Schaepman, M.E. Downscaling time series of MERIS full resolution data to monitor vegetation seasonal dynamics. Remote Sens. Environ. 2009, 113, 1874–1885.
29. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
30. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
31. Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach. Front. Plant Sci. 2018, 9, 1638.
32. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9, 708.
33. Uribeetxebarria, A.; Castellón, A.; Aizpurua, A. Optimizing wheat yield prediction integrating data from Sentinel-1 and Sentinel-2 with CatBoost algorithm. Remote Sens. 2023, 15, 1640.
34. Rasmussen, C.E.; Nickisch, H. Gaussian processes for machine learning (GPML) toolbox. J. Mach. Learn. Res. 2010, 11, 3011–3015.
35. Motulsky, H.; Christopoulos, A. Fitting Models to Biological Data Using Linear and Nonlinear Regression: A Practical Guide to Curve Fitting; Oxford University Press: Oxford, UK, 2004.
36. Hoerl, A.E.; Kennard, R.W. Ridge regression: Applications to nonorthogonal problems. Technometrics 1970, 12, 69–82.
37. Biau, G.; Devroye, L. On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification. J. Multivar. Anal. 2010, 101, 2499–2518.
38. Lin, H.-T. Introduction to Support Vector Machines; Learning System Group, California Institute of Technology: Pasadena, CA, USA, 2005.
39. Feng, C.; Zhang, W.; Deng, H.; Dong, L.; Zhang, H.; Tang, L.; Zheng, Y.; Zhao, Z. A Combination of OBIA and Random Forest Based on Visible UAV Remote Sensing for Accurately Extracted Information about Weeds in Areas with Different Weed Densities in Farmland. Remote Sens. 2023, 15, 4696.
40. Rigatti, S.J. Random forest. J. Insur. Med. 2017, 47, 31–39.
41. Delashmit, W.H.; Manry, M.T. Recent developments in multilayer perceptron neural networks. In Proceedings of the Seventh Annual Memphis Area Engineering and Science Conference, MAESC, Memphis, TN, USA, 11 May 2005; p. 33.
42. Zhang, M.-L.; Peña, J.M.; Robles, V. Feature selection for multi-label naive Bayes classification. Inf. Sci. 2009, 179, 3218–3229.
43. Ruan, Q.; Wu, Q.; Wang, Y.; Liu, X.; Miao, F. Effective learning model of user classification based on ensemble learning algorithms. Computing 2019, 101, 531–545.
44. Mouafik, M.; Fouad, M.; Audet, F.A.; El Aboudi, A. Comparative Analysis of Multi-Source Data for Machine Learning-Based LAI Estimation in Argania spinosa. Adv. Space Res. 2024, 73, 4976–4987.
45. de Magalhães, L.P.; Rossi, F. Use of Indices in RGB and Random Forest Regression to Measure the Leaf Area Index in Maize. Agronomy 2024, 14, 750.
46. Liu, S.; Jin, X.; Bai, Y.; Wu, W.; Cui, N.; Cheng, M.; Liu, Y.; Meng, L.; Jia, X.; Nie, C. UAV multispectral images for accurate estimation of the maize LAI considering the effect of soil background. Int. J. Appl. Earth Obs. Geoinf. 2023, 121, 103383.
47. Li, L.; Mu, X.; Jiang, H.; Chianucci, F.; Hu, R.; Song, W.; Qi, J.; Liu, S.; Zhou, J.; Chen, L. Review of ground and aerial methods for vegetation cover fraction (fCover) and related quantities estimation: Definitions, advances, challenges, and future perspectives. ISPRS J. Photogramm. Remote Sens. 2023, 199, 133–156.
48. Guo, Y.; Xiao, Y.; Li, M.; Hao, F.; Zhang, X.; Sun, H.; de Beurs, K.; Fu, Y.H.; He, Y. Identifying crop phenology using maize height constructed from multi-sources images. Int. J. Appl. Earth Obs. Geoinf. 2022, 115, 103121.
49. Ferraz, M.A.J.; Barboza, T.O.C.; Arantes, P.d.S.; Von Pinho, R.G.; Santos, A.F.d. Integrating Satellite and UAV Technologies for Maize Plant Height Estimation Using Advanced Machine Learning. AgriEngineering 2024, 6, 20–33.
50. Zhai, W.; Li, C.; Cheng, Q.; Ding, F.; Chen, Z. Exploring multisource feature fusion and stacking ensemble learning for accurate estimation of maize chlorophyll content using unmanned aerial vehicle remote sensing. Remote Sens. 2023, 15, 3454.
51. Rosle, R.; Che’Ya, N.; Roslin, N.; Halip, R.; Ismail, M. Monitoring early stage of rice crops growth using normalized difference vegetation index generated from UAV. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Kuala Lumpur, Malaysia, 8–10 December 2019; p. 012066.
52. Yang, Y.; Tao, B.; Liang, L.; Huang, Y.; Matocha, C.; Lee, C.D.; Sama, M.; Masri, B.E.; Ren, W. Detecting recent crop phenology dynamics in corn and soybean cropping systems of Kentucky. Remote Sens. 2021, 13, 1615.
53. Murguia-Cozar, A.; Macedo-Cruz, A.; Fernandez-Reynoso, D.S.; Salgado Transito, J.A. Recognition of Maize Phenology in Sentinel Images with Machine Learning. Sensors 2021, 22, 94.
54. Ye, J.; Bao, W.; Liao, C.; Chen, D.; Hu, H. Corn phenology detection using the derivative dynamic time warping method and sentinel-2 time series. Remote Sens. 2023, 15, 3456.
55. Ibarra, E.S.; Bustamante, W.O.; Capurata, R.E.O.; Cohen, I.S. Improving the monitoring of corn phenology in large agricultural areas using remote sensing data series. Span. J. Agric. Res. 2020, 18, 23.
Figure 1. Study area and experimental maize field. (a) Henan Province, China; (b) Xingyang city; and (c) the experimental maize field.
Figure 2. Sample photos of the maize vegetation growth stage. (a) Emergence stage; (b) Three-leaf stage; (c) Jointing stage; (d) Small trumpet stage; (e) Big trumpet stage; and (f) Tasseling stage.
Figure 4. Classification of maize growth stages based on the MLP.
Figure 5. Correlation analysis between VIs and (a) the LCC, (b) the LAI, and (c) the FVC.
Figure 6. Correlation analysis between TFs and (a) the LCC, (b) the LAI, and (c) the FVC.
Figure 7. Scatter plots for estimating maize phenotypic traits. (a) The LCC, (b) LAI, and (c) FVC.
Figure 8. Comparison between the measured and estimated CHs.
Figure 9. Classification of maize growth stages based on the RF.
Figure 10. Maize growth stage information maps. (a) Maize canopy digital images and (b) maize growth stages.
Table 1. Field measurement results for LCC, LAI, and CH.

| Stage | LCC (SPAD units): Num/Max/Min/Mean | LAI: Num/Max/Min/Mean | CH (cm): Num/Max/Min/Mean |
|---|---|---|---|
| P1 (6.30) | 80/-/-/- | 80/-/-/- | 80/-/-/- |
| P2 (7.05) | 80/-/-/- | 80/-/-/- | 80/77.00/47.00/62.00 |
| P3 (7.08) | 80/59.30/40.40/51.20 | 80/-/-/- | 80/82.00/50.00/66.00 |
| P4 (7.16) | 80/54.10/43.40/54.30 | 80/3.88/1.19/2.80 | 80/144.00/80.00/106.00 |
| P5 (7.22) | 80/62.00/45.60/52.60 | 80/3.91/1.91/2.96 | 80/185.00/129.00/153.00 |
| P6 (7.27) | 80/61.50/47.40/53.98 | 80/4.17/2.21/3.11 | 80/217.00/160.00/190.00 |
| P7 (8.10) | 80/66.70/54.00/60.20 | 80/5.46/2.70/3.87 | 80/250.00/195.00/225.00 |
| Total | 560/66.70/40.40/54.46 | 560/5.46/1.19/2.55 | 560/250.00/47.00/133.67 |

Note: During the P1 stage, which corresponds to the emergence stage, no measurements were taken. During stages P2–P3, the maize seedlings were too small to measure the LAIs; hence, the unmeasured data are marked as “-”.
Table 2. Vegetation indices.

| Name | Calculation | Reference |
|---|---|---|
| NDVI | (NIR − R)/(NIR + R) | [26] |
| NDRE | (NIR − RE)/(NIR + RE) | [27] |
| LCI | (NIR − RE)/(NIR + R) | [28] |
| OSAVI | 1.16(NIR − R)/(NIR + R + 0.16) | [29] |
| GNDVI | (NIR − G)/(NIR + G) | [30] |
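The band-ratio definitions in Table 2 translate directly into array arithmetic. The sketch below is an illustrative helper, not the authors' code; the function name and argument order are our own:

```python
import numpy as np

def vegetation_indices(nir, r, g, re):
    """Compute the five VIs of Table 2 from reflectance bands.

    nir, r, g, re: near-infrared, red, green, and red-edge reflectance,
    given as scalars or NumPy arrays of identical shape.
    """
    nir, r, g, re = (np.asarray(b, dtype=float) for b in (nir, r, g, re))
    return {
        "NDVI": (nir - r) / (nir + r),
        "NDRE": (nir - re) / (nir + re),
        "LCI": (nir - re) / (nir + r),
        "OSAVI": 1.16 * (nir - r) / (nir + r + 0.16),
        "GNDVI": (nir - g) / (nir + g),
    }
```

Applied per pixel to the five UAV multispectral bands, this yields one VI map per index.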
Table 3. Texture features.

| Name (Abbreviation) | Calculation |
|---|---|
| Mean (Mea) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times i$ |
| Variance (Var) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times (i - \mathrm{mean})^2$ |
| Homogeneity (Hom) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times \frac{1}{1 + (i-j)^2}$ |
| Contrast (Con) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times (i-j)^2$ |
| Dissimilarity (Dis) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times \lvert i-j \rvert$ |
| Entropy (Ent) | $-\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j) \times \log p(i,j)$ |
| Angular Second Moment (ASM) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} p(i,j)^2$ |
| Correlation (Cor) | $\sum_{i=0}^{N-1}\sum_{j=0}^{N-1} \frac{(i - \mathrm{mean})(j - \mathrm{mean}) \times p(i,j)}{\mathrm{variance}}$ |

Note: i and j represent the row and column numbers of the image, respectively; p(i, j) denotes the relative frequency of two adjacent pixels.
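Each entry of Table 3 can be evaluated directly from a normalized grey-level co-occurrence matrix (GLCM). The following sketch is our own illustration, assuming a single horizontal pixel offset and a small number of grey levels; it mirrors the formulas above term by term:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Evaluate the eight texture features of Table 3 for one
    grey-level co-occurrence matrix (horizontal offset (0, 1))."""
    img = np.asarray(img)
    # Count co-occurring grey-level pairs, then normalize to p(i, j).
    glcm = np.zeros((levels, levels), dtype=float)
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()

    i, j = np.indices(p.shape)
    mean = (p * i).sum()
    var = (p * (i - mean) ** 2).sum()
    return {
        "Mea": mean,
        "Var": var,
        "Hom": (p / (1.0 + (i - j) ** 2)).sum(),
        "Con": (p * (i - j) ** 2).sum(),
        "Dis": (p * np.abs(i - j)).sum(),
        "Ent": -(p[p > 0] * np.log(p[p > 0])).sum(),
        "ASM": (p ** 2).sum(),
        "Cor": ((i - mean) * (j - mean) * p).sum() / var if var > 0 else 1.0,
    }
```

In practice such features are computed over a sliding window for each image band and offset direction, then averaged.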
Table 4. Maize growth stage classification based on VIs and machine learning.

| Model | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|
| RF | 0.877 | 0.881 | 0.877 | 0.878 |
| SVM | 0.897 | 0.897 | 0.897 | 0.897 |
| MLP | 0.904 | 0.907 | 0.904 | 0.904 |
| NB | 0.839 | 0.850 | 0.839 | 0.841 |
| Stacking | 0.891 | 0.889 | 0.891 | 0.889 |
Table 5. Estimation results based on the VI.

| Traits | Model | R² | RMSE | MAE |
|---|---|---|---|---|
| LCC | CatBoost | 0.883 | 4.408 | 3.393 |
| LCC | GPR | 0.900 | 4.071 | 3.192 |
| LCC | LR | 0.749 | 6.458 | 5.235 |
| LCC | RR | 0.686 | 7.230 | 5.730 |
| LCC | RFR | 0.889 | 4.302 | 3.257 |
| LCC | SVR | 0.707 | 6.986 | 5.339 |
| LCC | KNNR | 0.886 | 4.361 | 3.346 |
| LAI | CatBoost | 0.602 | 0.506 | 0.401 |
| LAI | GPR | 0.621 | 0.494 | 0.397 |
| LAI | LR | 0.578 | 0.522 | 0.413 |
| LAI | RR | 0.496 | 0.570 | 0.467 |
| LAI | RFR | 0.594 | 0.512 | 0.409 |
| LAI | SVR | 0.572 | 0.525 | 0.408 |
| LAI | KNNR | 0.599 | 0.508 | 0.402 |
| FVC | CatBoost | 0.678 | 0.066 | 0.048 |
| FVC | GPR | 0.730 | 0.060 | 0.044 |
| FVC | LR | 0.723 | 0.061 | 0.047 |
| FVC | RR | 0.619 | 0.071 | 0.054 |
| FVC | RFR | 0.666 | 0.067 | 0.048 |
| FVC | SVR | 0.702 | 0.063 | 0.048 |
| FVC | KNNR | 0.702 | 0.063 | 0.047 |
Table 6. Estimation results based on the TFs.

| Traits | Model | R² | RMSE | MAE |
|---|---|---|---|---|
| LCC | CatBoost | 0.893 | 4.223 | 3.172 |
| LCC | GPR | 0.863 | 4.767 | 3.417 |
| LCC | LR | 0.736 | 6.624 | 5.355 |
| LCC | RR | 0.704 | 7.022 | 5.713 |
| LCC | RFR | 0.898 | 4.113 | 3.065 |
| LCC | SVR | 0.693 | 7.143 | 5.570 |
| LCC | KNNR | 0.856 | 4.895 | 3.543 |
| LAI | CatBoost | 0.599 | 0.510 | 0.403 |
| LAI | GPR | 0.613 | 0.501 | 0.397 |
| LAI | LR | 0.620 | 0.496 | 0.393 |
| LAI | RR | 0.544 | 0.543 | 0.430 |
| LAI | RFR | 0.641 | 0.482 | 0.378 |
| LAI | SVR | 0.591 | 0.515 | 0.405 |
| LAI | KNNR | 0.601 | 0.508 | 0.396 |
| FVC | CatBoost | 0.562 | 0.060 | 0.045 |
| FVC | GPR | 0.571 | 0.060 | 0.044 |
| FVC | LR | 0.575 | 0.060 | 0.045 |
| FVC | RR | 0.558 | 0.061 | 0.045 |
| FVC | RFR | 0.563 | 0.060 | 0.044 |
| FVC | SVR | 0.574 | 0.060 | 0.045 |
| FVC | KNNR | 0.514 | 0.064 | 0.047 |
Table 7. Estimation results based on the VI and TF.

| Traits | Model | R² | RMSE | MAE |
|---|---|---|---|---|
| LCC | CatBoost | 0.916 | 3.744 | 2.840 |
| LCC | GPR | 0.899 | 4.090 | 3.034 |
| LCC | LR | 0.865 | 4.734 | 3.762 |
| LCC | RR | 0.784 | 6.001 | 4.895 |
| LCC | RFR | 0.920 | 3.655 | 2.698 |
| LCC | SVR | 0.819 | 5.493 | 4.265 |
| LCC | KNNR | 0.910 | 3.867 | 2.923 |
| LAI | CatBoost | 0.603 | 0.506 | 0.401 |
| LAI | GPR | 0.621 | 0.494 | 0.397 |
| LAI | LR | 0.578 | 0.522 | 0.414 |
| LAI | RR | 0.496 | 0.570 | 0.467 |
| LAI | RFR | 0.594 | 0.512 | 0.409 |
| LAI | SVR | 0.572 | 0.525 | 0.408 |
| LAI | KNNR | 0.599 | 0.508 | 0.402 |
| FVC | CatBoost | 0.750 | 0.054 | 0.041 |
| FVC | GPR | 0.769 | 0.052 | 0.042 |
| FVC | LR | 0.777 | 0.051 | 0.040 |
| FVC | RR | 0.736 | 0.056 | 0.045 |
| FVC | RFR | 0.771 | 0.052 | 0.040 |
| FVC | SVR | 0.759 | 0.053 | 0.041 |
| FVC | KNNR | 0.753 | 0.054 | 0.041 |
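Tables 5–7 score every estimator with the same three statistics: the coefficient of determination (R²), root-mean-square error (RMSE), and mean absolute error (MAE). A minimal NumPy implementation of these metrics (our own helper, not the authors' code) is:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """R², RMSE, and MAE between measured and estimated trait values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    ss_res = (resid ** 2).sum()                      # residual sum of squares
    ss_tot = ((y_true - y_true.mean()) ** 2).sum()   # total sum of squares
    return {
        "R2": 1.0 - ss_res / ss_tot,
        "RMSE": np.sqrt((resid ** 2).mean()),
        "MAE": np.abs(resid).mean(),
    }
```

Each row of Tables 5–7 corresponds to one such evaluation of a fitted regressor on held-out plot measurements.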
Table 8. Classification accuracy for maize growth stages.

| Traits | Model | Accuracy | Precision | Recall | F1 |
|---|---|---|---|---|---|
| LCC, LAI, FVC, CH | RF | 0.951 | 0.951 | 0.951 | 0.951 |
| LCC, LAI, FVC, CH | SVM | 0.908 | 0.912 | 0.908 | 0.909 |
| LCC, LAI, FVC, CH | MLP | 0.913 | 0.913 | 0.913 | 0.912 |
| LCC, LAI, FVC, CH | NB | 0.915 | 0.916 | 0.915 | 0.914 |
| LCC, LAI, FVC, CH | Stacking | 0.946 | 0.945 | 0.946 | 0.946 |
| LCC, FVC, CH | RF | 0.942 | 0.942 | 0.942 | 0.942 |
| LCC, FVC, CH | SVM | 0.906 | 0.909 | 0.906 | 0.906 |
| LCC, FVC, CH | MLP | 0.904 | 0.905 | 0.904 | 0.904 |
| LCC, FVC, CH | NB | 0.922 | 0.925 | 0.922 | 0.922 |
| LCC, FVC, CH | Stacking | 0.945 | 0.945 | 0.945 | 0.945 |
| LAI, FVC, CH | RF | 0.915 | 0.920 | 0.915 | 0.916 |
| LAI, FVC, CH | SVM | 0.915 | 0.917 | 0.915 | 0.915 |
| LAI, FVC, CH | MLP | 0.886 | 0.894 | 0.886 | 0.886 |
| LAI, FVC, CH | NB | 0.864 | 0.867 | 0.864 | 0.847 |
| LAI, FVC, CH | Stacking | 0.924 | 0.925 | 0.924 | 0.924 |
| FVC, CH | RF | 0.895 | 0.898 | 0.895 | 0.895 |
| FVC, CH | SVM | 0.915 | 0.917 | 0.915 | 0.915 |
| FVC, CH | MLP | 0.909 | 0.908 | 0.909 | 0.908 |
| FVC, CH | NB | 0.900 | 0.902 | 0.900 | 0.897 |
| FVC, CH | Stacking | 0.908 | 0.908 | 0.908 | 0.907 |
| LCC, LAI, CH | RF | 0.913 | 0.913 | 0.913 | 0.913 |
| LCC, LAI, CH | SVM | 0.895 | 0.898 | 0.895 | 0.895 |
| LCC, LAI, CH | MLP | 0.888 | 0.890 | 0.888 | 0.889 |
| LCC, LAI, CH | NB | 0.862 | 0.871 | 0.862 | 0.842 |
| LCC, LAI, CH | Stacking | 0.929 | 0.928 | 0.929 | 0.928 |
| LAI, CH | RF | 0.833 | 0.829 | 0.833 | 0.830 |
| LAI, CH | SVM | 0.837 | 0.760 | 0.837 | 0.793 |
| LAI, CH | MLP | 0.826 | 0.809 | 0.826 | 0.787 |
| LAI, CH | NB | 0.842 | 0.840 | 0.842 | 0.814 |
| LAI, CH | Stacking | 0.862 | 0.859 | 0.862 | 0.854 |
| LCC, CH | RF | 0.906 | 0.906 | 0.904 | 0.905 |
| LCC, CH | SVM | 0.902 | 0.904 | 0.902 | 0.902 |
| LCC, CH | MLP | 0.886 | 0.886 | 0.884 | 0.885 |
| LCC, CH | NB | 0.866 | 0.874 | 0.866 | 0.852 |
| LCC, CH | Stacking | 0.913 | 0.913 | 0.912 | 0.911 |
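The four columns of Tables 4 and 8 can be derived from a multi-class confusion matrix. The helper below is a NumPy sketch under the assumption of class-frequency-weighted averaging of the per-class scores (the paper does not state its averaging scheme, so this weighting is our assumption):

```python
import numpy as np

def classification_metrics(y_true, y_pred, n_classes):
    """Accuracy plus class-frequency-weighted precision, recall, and F1."""
    # Confusion matrix: rows = true growth stage, columns = predicted stage.
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1

    tp = np.diag(cm)              # correct predictions per class
    col = cm.sum(axis=0)          # predicted counts per class
    row = cm.sum(axis=1)          # true counts (support) per class
    prec = np.divide(tp, col, out=np.zeros(n_classes), where=col > 0)
    rec = np.divide(tp, row, out=np.zeros(n_classes), where=row > 0)
    f1 = np.divide(2 * prec * rec, prec + rec,
                   out=np.zeros(n_classes), where=(prec + rec) > 0)
    w = row / row.sum()           # weight each class by its support
    return {
        "Accuracy": tp.sum() / cm.sum(),
        "Precision": (w * prec).sum(),
        "Recall": (w * rec).sum(),
        "F1": (w * f1).sum(),
    }
```

Under this weighting, recall always equals accuracy, which is consistent with the identical Accuracy and Recall values in most rows of Tables 4 and 8.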
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Yao, Y.; Yue, J.; Liu, Y.; Yang, H.; Feng, H.; Shen, J.; Hu, J.; Liu, Q. Classification of Maize Growth Stages Based on Phenotypic Traits and UAV Remote Sensing. Agriculture 2024, 14, 1175. https://doi.org/10.3390/agriculture14071175