Article

Combining Texture, Color, and Vegetation Index from Unmanned Aerial Vehicle Multispectral Images to Estimate Winter Wheat Leaf Area Index during the Vegetative Growth Stage

1
Jiangsu Key Laboratory of Crop Genetics and Physiology/Jiangsu Key Laboratory of Crop Cultivation and Physiology, Agricultural College of Yangzhou University, Yangzhou 225009, China
2
Jiangsu Co-Innovation Center for Modern Production Technology of Grain Crops, Yangzhou University, Yangzhou 225009, China
3
Joint International Research Laboratory of Agriculture and Agricultural Product Safety, Yangzhou University, Yangzhou 225009, China
*
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Remote Sens. 2023, 15(24), 5715; https://doi.org/10.3390/rs15245715
Submission received: 4 November 2023 / Revised: 30 November 2023 / Accepted: 30 November 2023 / Published: 13 December 2023

Abstract:
Leaf Area Index (LAI) is a fundamental indicator of plant growth status in agronomy and environmental research. With the rapid development of drone technology, the estimation of crop LAI based on drone imagery and vegetation indices is becoming increasingly popular. However, there is still a lack of detailed research on the feasibility of using image texture to estimate LAI and on the impact of combining texture indices with vegetation indices on LAI estimation accuracy. In this study, two key growth stages of winter wheat (the green-up and jointing stages) were selected, and LAI was calculated using digital hemispherical photography. The feasibility of predicting winter wheat LAI was explored under three conditions: vegetation indices, texture indices, and a combination of the two, at flight heights of 20 m and 40 m. Two feature selection methods (Lasso and recursive feature elimination) were combined with four machine learning regression models (multiple linear regression, random forest, support vector machine, and backpropagation neural network). The results showed that during the vegetative growth stage of winter wheat, the model combining texture information with vegetation indices performed better than the models using vegetation indices or texture information alone. The best prediction result based on vegetation indices was RFECV-MLR at a flight height of 40 m (R2 = 0.8943, RMSE = 0.4139, RRMSE = 0.1304, RPD = 3.0763); the best prediction result based on texture indices was RFECV-RF at a flight height of 40 m (R2 = 0.8894, RMSE = 0.4236, RRMSE = 0.1335, RPD = 3.0063); and the best prediction result combining texture and vegetation indices was RFECV-RF at a flight height of 40 m (R2 = 0.9210, RMSE = 0.3579, RRMSE = 0.1128, RPD = 3.5575).
The results of this study demonstrate that combining vegetation indices and texture from multispectral drone imagery can improve the accuracy of LAI estimation during the vegetative growth stage of winter wheat. In addition, selecting a flight height of 40 m can improve efficiency in large-scale agricultural field monitoring, as this study showed that drone data at flight heights of 20 m and 40 m did not significantly affect model accuracy.

Graphical Abstract

1. Introduction

As a fundamental variable in agronomic and environmental studies, leaf area index (LAI) is often used as a key biophysical indicator of vegetation [1]. LAI is widely used in the study of plant photosynthesis [2], nitrogen fertilizer management [3] and water use [4]. LAI also plays a crucial role in practical applications of precision agriculture, including crop growth diagnostics, biomass estimation, and yield prediction. Wheat, as a vital component of the global human diet, provides a substantial source of carbohydrates, proteins, and fiber. Hence, the timely and accurate monitoring of winter wheat LAI information holds significant importance for its growth management and production forecasting.
Traditional methods for studying seasonal variations in vegetation LAI primarily involve direct and optical instrument-based approaches, which rely on intermittent LAI data for seasonal dynamics analysis [5]. Direct measurement methods include destructive sampling, the litterfall method, and the point-quadrat method. They are relatively accurate but require a large amount of work and time, are labor-intensive, can damage the vegetation, and cannot obtain continuous LAI data for the same plants. Indirect methods use optical principles to obtain LAI, with common instruments including the LAI-2200, AccuPAR, SunScan, and TRAC, and offer advantages such as ease of operation and non-destructiveness. In contrast to these more expensive instruments, digital hemispherical photography (DHP, the fisheye camera method) is widely used due to its cost-effectiveness and ease of application [6]. However, DHP is subject to external factors like weather and terrain, limiting its ability to collect continuous long-term LAI data. In summary, both direct and indirect methods have various limitations and cannot provide spatial distribution information for LAI at a regional scale.
On the other hand, satellite platforms provide extensive remote sensing (RS) data. Remote sensing data can capture the reflectance of crop canopies, leading to widespread research on rapid and non-destructive LAI estimation at different perceptual scales using RS technology [7]. However, precision management during the growth season requires a large amount of timely multi-temporal data, and satellites are often constrained by factors such as revisit cycles and weather conditions, making it difficult to obtain a sufficient quantity of high-quality satellite data across multiple crop growth stages.
In recent years, the development of Unmanned Aerial Vehicles (UAVs) and their applications in remote sensing have provided new solutions for LAI estimation. Owing to their flexibility in data acquisition and higher temporal and spatial resolutions, UAV platforms for LAI research have gained widespread attention in the global academic community. For example, Zhu et al. [8] investigated lookup tables based on reflectance and vegetation indices at individual growth stages of wheat and evaluated the performance of different lookup tables for LAI retrieval. Lee et al. [9] used UAV imagery to estimate rice growth and found no significant correlation between LAI and vegetation indices after rice heading, indicating the challenge of establishing an LAI model applicable to multiple growth stages. Gong et al. [10] used several common vegetation indices derived from drone imagery and found that the product of a vegetation index and canopy height can estimate LAI throughout the entire rice growing season; unfortunately, the estimation error of the model reached 24%, and the influence of drone flight altitude was not considered. Zhang et al. [11] established a general model for the jointing, heading, and grain-filling stages of winter wheat based on UAV hyperspectral data. That study considered both the vegetative growth stage (jointing) and the reproductive growth stages (heading and grain-filling) of wheat, showing strong applicability. However, it did not capture the significant changes in LAI during the early vegetative period of wheat (from the green-up stage to the jointing stage), and it likewise did not consider the influence of UAV flight altitude.
The above studies indicate that research on crop LAI using UAV platforms has garnered attention, but remote sensing monitoring of LAI throughout the entire vegetative growth period of crops is still lacking. In addition, the choice of UAV flight altitude for crop LAI estimation was not explored in previous studies; most existing studies used a single, empirically determined altitude. Flight altitude is closely related to both the endurance time of the UAV and the estimation accuracy of the model, so comparing different flight altitudes is necessary.
Estimating LAI based on spectral information is the approach used in most studies, and does not fully exploit UAV image information. In remote sensing science, the combination of spectral and texture information is complementary, providing rich information, enhancing classification accuracy, and interpreting image content [12]. Texture information has a close relationship with crop LAI [13].
Texture analysis is an image processing technique used to measure the variability in pixel values between adjacent pixels within a defined analysis window. In remote sensing imagery, it was initially applied to image classification and forest biomass estimation [14]. Later, texture information from remote sensing images was used to study forest leaf area index (LAI). Pu et al. [15] proposed a pixel-based seasonal LAI regression model using four seasonal Pleiades satellite images and corresponding LAI measurements, selecting a set of spectral and texture features. Pu and Cheng [16] found that texture-based features extracted from WorldView-2 data had a better ability to estimate and map forest LAI than spectral-based features from the same data. Bolivar-Santamaria et al. [17] combined field vegetation structure measurements with Sentinel-2 images and used spectral and texture variables derived from those images to predict LAI. These studies explored the correlation of forest LAI with satellite image texture information. Texture information has also been studied in crop growth monitoring. Eckert et al. [13] found that combining spectral features and texture measurements improved biomass estimation compared to using either alone. Zheng et al. [18] mounted a six-band spectral camera on a UAV for rice biomass estimation and demonstrated that the Normalized Difference Texture Index (NDTI) based on the mean texture of the 550 nm and 800 nm band images outperformed the other texture variables and spectral indices. Subsequently, more studies demonstrated that texture predicted biomass better than spectral variables [19,20]. However, there have been few studies on the use of texture information from UAV multispectral images to estimate crop LAI.
The green-up and jointing stages correspond to stages 25–30 and 30–32 of the Zadoks [21] scale, respectively. The green-up stage is the second tillering peak of winter wheat, and the jointing stage is a critical period for wheat's vegetative growth, reproductive growth, and spike differentiation. Within the entire reproductive period of winter wheat, the green-up and jointing stages are among the periods in which the leaf area index changes most significantly. In the middle and lower reaches of the Yangtze River, the 'late spring cold' phenomenon occurs frequently [22]; the leaves of winter wheat suffer damage from the falling temperatures, and large areas of yellowing and wilting may appear. The resulting decrease in leaf area index seriously affects the crop's conversion of solar energy and its nitrogen utilization efficiency, which in turn affects the future growth rate and final yield of wheat. In this study, we chose these stages (green-up and jointing), and the objective was to explore the predictive performance of UAV multispectral image-based vegetation indices and texture features for winter wheat LAI estimation by collecting UAV multispectral data and wheat canopy hemispherical photographs at different flight altitudes. Different feature selection methods were then combined with different machine learning algorithms to construct a model suitable for LAI estimation of winter wheat at the vegetative growth stage. In addition, to improve the generalization ability of the model, several wheat varieties were included in the study.

2. Materials and Methods

2.1. Experimental Site and Design

During the winter of 2022–2023, we conducted our study at the Integrated Demonstration Base of Modern Agricultural Science and Technology in Jiangyan District, Taizhou City, Jiangsu Province, China. The demonstration site is in the middle and lower reaches of the Yangtze River, has a flat topography, and is in an area where the 'late spring cold' phenomenon occurs. The study area was divided into 72 plots (as shown in Figure 1), each with a planting area of 12 m2 and a row spacing of 25 cm, of which 24 plots were used for Experiment I and the remaining 48 plots for Experiment II. Both experiments focused on the management of nitrogen fertilizer in winter wheat; the two experiments differed in their nitrogen application methods and treatment protocols.
The first 24 plots (Experiment I) focused on the growth of winter wheat under different methods of N fertilizer application. The wheat varieties were Yangmai 39 (YM 39) and Yangmai 22 (YM 22). Urea and resin-coated urea were selected as nitrogen fertilizers, and four different methods of nitrogen fertilizer application were compared, including spreading application, inter-row application, and inter-row mixing. The application rate of N fertilizer was 240 kg/ha. The experiment was conducted in a split-plot design with variety as the main plot, fertilizer type as the subplot, and fertilizer application method as the sub-subplot, with three replications, and the target plant density was set at 240,000 plants per hectare. Other practices followed standard cultivation management.
The latter 48 plots (Experiment II) were planted with four wheat varieties: Yangmai 25 (YM 25), YM 39, Ningmai 26 (NM 26), and YM 22. Different wheat varieties have different nitrogen utilization efficiencies (YM 22 and NM 26 are nitrogen-inefficient, while YM 25 and YM 39 are nitrogen-efficient), which leads to strong differences in leaf area index across all growth stages of these four varieties and benefits the generalizability of the research results. Four nitrogen fertilizer treatments of 0 kg/ha, 150 kg/ha, 240 kg/ha, and 330 kg/ha were used, each replicated three times. Nitrogen fertilizer was applied in a 5:1:2:2 ratio as basal, tillering, jointing, and heading fertilizer, respectively. The basal fertilizer was applied before rotary tillage and sowing, the tillering fertilizer when the wheat reached the three-leaf stage, the jointing fertilizer when the remaining leaf number reached 2.5, and the heading fertilizer when the remaining leaf number dropped to 0.8. Phosphorus and potash fertilizers were applied in the form of P2O5 and K2O; for all treatment groups, phosphorus and potassium were each applied at a rate of 135 kg per hectare as a one-time basal dressing. Wheat was sown manually in furrows with 25 cm inter-row spacing, and each plot covered 12 m². Plant counts were conducted when the wheat reached the two-leaf stage to achieve the target plant density of 240,000 plants per hectare. Other field management followed standard farm practices.

2.2. Data Collection and Processing

2.2.1. UAV Image Acquisition and Processing

A DJI P4M drone was chosen for the experiment; it collects reflectance in five bands (Blue (B), Green (G), Red (R), Rededge (RE), and Near Infrared (NIR)). Unlike common RGB cameras on the market, the DJI P4M camera's five spectral bands provide a basis for calculating commonly used vegetation indices. Data collection was carried out in the morning under clear weather to avoid cloud shadows and shadows in the multispectral images. Route planning was performed with DJI Ground Station Pro 2.0 software (https://www.dji.com/cn/ground-station-pro (accessed on 17 May 2023)). Two diffuse reflectance standard plates with reflectances of 0.5 and 0.75 were placed on the ground before takeoff for radiometric calibration. The specific UAV flight parameter settings are shown in Table A1 in Appendix A. After each flight, we processed the images using DJI Terra 2.3 software (https://enterprise.dji.com/cn/dji-terra (accessed on 22 May 2023)) to produce a final single-band reflectance image. The images were then classified into soil, shadow, and vegetation using eCognition 9.0 software (https://geospatial.trimble.com/en (accessed on 19 May 2023)); the same categories were merged and the soil was masked using ArcMap (ESRI Inc., Redlands, CA, USA).

2.2.2. Field Leaf Area Index Measurement

The measurement of the Green Plant Area Index (PAI) for the winter wheat canopy was conducted on the same day as the acquisition of the UAV multispectral images. PAI was determined using the Digital Hemispherical Photography (DHP) method, which is known for its simplicity, non-destructive nature, and wide application in vegetation studies. Shang et al. [6], for example, used DHP to assess the spatiotemporal variation in crop PAI while studying the interactions between plants and environmental conditions. Dong et al. [23], based on DHP measurements of LAI for spring wheat and rapeseed, developed a universal LAI estimation algorithm using red-edge vegetation indices for different crops. In the field measurements, a 10.5 mm fisheye lens mounted on a Nikon D7500 camera was used to capture canopy images; the very wide field of view of the fisheye lens is a prerequisite for calculating crop LAI by digital hemispherical photography. During each sampling event, photos were taken uniformly over the winter wheat canopy. Given the relatively low height of the wheat canopy during the study period, the camera lens was positioned 0.5 m above the canopy, facing downward [24]. Fourteen photos were taken for each experimental plot, ensuring full coverage of the plot. These images were processed in the laboratory using the CanEye 6.495 software [25], which calculates both the Effective PAI and the Total PAI. Total PAI is defined as half of the total surface area of plant tissue per unit ground area [26,27]. For wheat, the Green Effective PAI is equivalent to the Leaf Area Index (LAI).

2.2.3. Extracting Texture and Vegetation Indices

In this study, 40 common vegetation indices were selected to estimate the leaf area index (LAI) (Table 1). These indices capture information on vegetation growth, health, and related properties. For example, the Normalized Difference Vegetation Index (NDVI), calculated from reflectance in the near-infrared and red bands, takes higher values for more lush vegetation; the Green Chlorophyll Index (CI_green), calculated from reflectance in the green and near-infrared bands, correlates with chlorophyll content, because chlorophyll absorbs relatively little green light and green reflectance is therefore sensitive to chlorophyll concentration; and the Optimized Soil Adjusted Vegetation Index (OSAVI), a variant of the Soil Adjusted Vegetation Index (SAVI), tends to provide more stable results in areas with highly variable soil types.
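As an illustration, three of the indices in Table 1 can be computed directly from the band reflectances with a few lines of NumPy. This is a minimal sketch on made-up per-pixel values, not the authors' processing code; note that the OSAVI implemented here uses the standard denominator (NIR + R + 0.16).

```python
import numpy as np

def vegetation_indices(R, G, NIR):
    """Compute three of the vegetation indices from Table 1
    (band arrays of reflectance values in [0, 1])."""
    ndvi = (NIR - R) / (NIR + R)          # Normalized Difference Vegetation Index
    ci_green = NIR / G - 1                # Green Chlorophyll Index (CI_green)
    osavi = (NIR - R) / (NIR + R + 0.16)  # Optimized Soil Adjusted Vegetation Index
    return ndvi, ci_green, osavi

# Toy per-pixel reflectances, for illustration only
R, G, NIR = np.array([0.08]), np.array([0.12]), np.array([0.45])
ndvi, ci_green, osavi = vegetation_indices(R, G, NIR)
```

In practice these operations would be applied band-wise to the masked orthomosaic and then averaged per plot.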
Common texture indices, including contrast, correlation, energy, and homogeneity, were computed from the Gray-Level Co-occurrence Matrix (GLCM), a classic texture-feature extraction method introduced by Haralick et al. [28] in 1973. These texture indices were extracted independently for each sample area from the GLCM, averaged per area, and tracked across growth stages to obtain their temporal variation. In this study, eight texture features were extracted from the multispectral images using the "Co-occurrence Measures" function in ENVI 5.3 software: mean (ME), variance (VA), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SE), and correlation (COR) (Table 2). A window size of 7 × 7 was chosen based on previous research [29] and multiple trials to ensure clear differentiation between soil and wheat pixels, while other parameters were kept at their default values.
Table 1. Forty spectral variables used for LAI estimation in this study.
NO. VI: Formula [Reference]
1. R: R
2. G: G
3. B: B
4. NIR: NIR
5. Rededge: Rededge
6. INT: (R + G + B)/3 [30]
7. IKAW: (R − B)/(R + B) [31]
8. IPCA: 0.994|R − B| + 0.961|G − B| + 0.914|G − R| [32]
9. ExR: 1.4R − G [33]
10. ExG: 2G − R − B [34]
11. ExGR: 2G − R − B − (1.4R − G) [33]
12. MGRVI: (G^2 − R^2)/(G^2 + R^2) [35]
13. RGBVI: (G^2 − B × R)/(G^2 + B × R) [35]
14. NDVI: (NIR − R)/(NIR + R) [36]
15. GNDVI: (NIR − G)/(NIR + G) [37]
16. RVI: NIR/R [38]
17. NDREI: (NIR − RE)/(NIR + RE) [39]
18. EVI: 2.5 × (NIR − R)/(1 + NIR − 2.4 × R) [40]
19. OSAVI: (NIR − R)/(NIR − R + 0.16) [41]
20. MCARI: [(RE − R) − 0.2 × (RE − G)] × (RE/R) [42]
21. TCARI: 3 × [(RE − R) − 0.2 × (RE − G) × (RE/R)] [43]
22. NRI: (G − R)/(G + R) [44]
23. TVI: sqrt(NDVI + 0.5) [45]
24. MSR: ((NIR/R) − 1) × sqrt(NIR/R + 1) [46]
25. SIPI: (NIR − B)/(NIR + B) [47]
26. PSRI: (R − B)/NIR [48]
27. CI_re: NIR/R − 1 [49]
28. SAVI: (NIR − R)/(NIR + R + 0.5) × 1.5 [50]
29. CRI: 1/G + 1/NIR [51]
30. NLI: (NIR × NIR − R)/(NIR × NIR + R) [52]
31. RDVI: (NIR − R)/sqrt(NIR + R) [53]
32. CI_green: NIR/G − 1 [49]
33. MTCI: (NIR − RE)/(RE − R) [54]
34. MTVI1: 1.2 × [1.2 × (NIR − G) − 2.5 × (R − G)] [55]
35. MTVI2: (1.5 × (1.2 × (NIR − G) − 2.5 × (R − G)))/sqrt((2 × NIR + 1)^2 − (6 × NIR − 5 × sqrt(R)) − 0.5) [55]
36. VARI: (G − R)/(G + R − B) [49]
37. ExB: 1.4 × B − G [56]
38. WI: (G − B)/(R − G) [34]
39. GLA: (2 × G − R − B)/(2 × G + R + B) [36]
40. VEG: G/(R^a × B^(1−a)), a = 0.667 [57]
Note: B, G, R, rededge, and NIR are the raw values for the five bands of the UAV.

2.3. Feature Variable Screening

Feature selection is particularly important when there is a high degree of correlation among features. Highly correlated features can introduce multicollinearity issues. Therefore, two effective feature selection methods were employed in this study, namely L1 regularization Least Absolute Shrinkage and Selection Operator (Lasso) and Recursive Feature Elimination (RFE).
In ordinary least squares regression, the objective is to fit the model parameters by minimizing the squared error between the observed and predicted values. Lasso [58] regression adds an L1 penalty to this objective (the second term in Equation (1); the first term is the ordinary least squares objective). The L1 penalty tends to drive the coefficients of unimportant features to zero, automating feature selection, which helps prevent overfitting and improves the generalization ability of the model. The alpha parameter can be selected through cross-validation. By zeroing coefficients, Lasso effectively reduces the number of features a given solution depends on, a property that is also important in the field of compressed sensing.
\min_{\omega} \frac{1}{2 n_{\mathrm{samples}}} \lVert X\omega - y \rVert_2^2 + \alpha \lVert \omega \rVert_1 \quad (1)
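A minimal scikit-learn sketch of Lasso-based selection on synthetic data (not the study's actual feature matrix): `LassoCV` chooses alpha by 5-fold cross-validation, and the surviving non-zero coefficients identify the selected features.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for the feature matrix: 100 samples, 10 candidate
# predictors, of which only the first two actually drive the target
X = rng.normal(size=(100, 10))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# Standardize so the L1 penalty treats all features comparably
X_scaled = StandardScaler().fit_transform(X)

# LassoCV picks alpha by cross-validation (5-fold, as in the paper)
lasso = LassoCV(cv=5, random_state=0).fit(X_scaled, y)
selected = np.flatnonzero(lasso.coef_)  # features with non-zero coefficients
```

Standardizing before fitting matters because the L1 penalty is scale-sensitive.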
The Recursive Feature Elimination (RFE) method [59] utilizes an external estimator that assigns weights to features and selects features by recursively considering smaller and smaller feature sets. First, the estimator is trained on the initial feature set, and the importance of each feature is obtained through specific attributes or callable methods. The least important features are then pruned from the current feature set, and this process is repeated recursively on the pruned set until the desired number of selected features is reached. The modeling process employed a cross-validated RFE methodology in which ridge regression was used as the evaluation estimator.
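The cross-validated RFE procedure with a ridge estimator can be sketched with scikit-learn's `RFECV` on synthetic data (the study's actual feature matrix is not reproduced here):

```python
import numpy as np
from sklearn.feature_selection import RFECV
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
# Synthetic data: 8 candidate features, only features 0 and 2 informative
X = rng.normal(size=(120, 8))
y = 3.0 * X[:, 0] + 2.0 * X[:, 2] + rng.normal(scale=0.1, size=120)

# Ridge supplies the coefficient magnitudes that RFE uses to rank features;
# RFECV repeats the elimination under 5-fold cross-validation and keeps
# the feature count that maximizes the cross-validated score
selector = RFECV(estimator=Ridge(alpha=1.0), step=1, cv=5).fit(X, y)
kept = np.flatnonzero(selector.support_)  # indices of retained features
```

`step=1` removes one feature per iteration, matching the "smaller and smaller feature sets" description above.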

2.4. Machine Learning Regression Algorithms

In this study, models for estimating the Leaf Area Index (LAI) of winter wheat during its vegetative growth stages (green-up and jointing) were developed. At flight altitudes of 20 m and 40 m, feature selection was performed using two different methods on three different datasets, and four machine learning algorithms were then employed to fit the data, resulting in a total of 48 regression models. The model parameters (except for MLR) were tuned using a grid search combined with 5-fold cross-validation. The four algorithms were Multiple Linear Regression [60], Random Forest [61], Support Vector Machine [62], and Backpropagation Neural Network [63]. The first three have been used frequently in previous studies and include both linear and nonlinear models. Given the growing use of deep learning in agriculture, a backpropagation neural network was also included so that different models could be compared and selected.

2.4.1. Multiple Linear Regression

Linear regression is a commonly used machine learning algorithm for prediction and modeling, employed to analyze the linear relationship between variables. The fundamental principle is to establish the relationship between input features and output targets by finding the best-fitting line (or plane). Multiple linear regression (MLR, as shown in Equation (2)) extends univariate regression to multiple input variables; it makes full use of multivariate inputs in practical applications and offers strong interpretability with a simple form.
\mathrm{LAI} = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_n X_n \quad (2)
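Equation (2) corresponds directly to scikit-learn's `LinearRegression`; a noise-free toy example (illustrative values, not the study's data) recovers the coefficients exactly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
# Two toy predictors (e.g. a vegetation index and a texture measure)
X = rng.uniform(size=(60, 2))
# Noise-free target so the fit recovers beta_0=0.8, beta_1=1.2, beta_2=0.5
lai = 0.8 + 1.2 * X[:, 0] + 0.5 * X[:, 1]

mlr = LinearRegression().fit(X, lai)
# beta_0 is mlr.intercept_; beta_1..beta_n are mlr.coef_
```

With real, noisy data the estimates would only approximate the true coefficients, which is why the paper evaluates on a held-out test set.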

2.4.2. Random Forest

Random forests combine the bootstrap resampling method proposed by Breiman et al. [61] and the random subspace method introduced by Ho [64]. They find application in various tasks, such as classification, regression, and clustering. In Random Forest, each decision tree is constructed from a random bootstrap sample of the training data, and the ensemble aggregates the trees' predictions through voting or averaging. This aggregation yields more stable and accurate results and reduces the risk of overfitting.
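A minimal sketch of a random forest regressor on synthetic data (the hyperparameter values here are illustrative, not the study's tuned ones):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
# Synthetic nonlinear target with mild noise
X = rng.uniform(size=(200, 5))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.05, size=200)

# Each tree sees a bootstrap sample of the rows and a random subset of
# features at each split; predictions are averaged over the ensemble
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
r2_train = rf.score(X, y)  # in-sample R^2, expected to be high
```

In-sample scores flatter tree ensembles, which is why the paper relies on cross-validation and a held-out test set for evaluation.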

2.4.3. Support Vector Machine

Support Vector Machine (SVM) is based on the principle of structural risk minimization. Applied to regression prediction, SVM is called Support Vector Regression (SVR). To handle nonlinear problems, SVR transforms them into linear problems in a high-dimensional space and replaces the inner product operation in that space with a kernel function. The parameters of SVR include "gamma" (the kernel parameter) and C (the penalty coefficient). In this study, a linear kernel was used, and the optimal parameter C was found through a grid search.
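The grid search over C with a linear kernel can be sketched as follows (synthetic data; the candidate C grid is an assumption, as the paper does not list its grid values):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(4)
# Synthetic linear target with mild noise
X = rng.normal(size=(150, 4))
y = X @ np.array([1.0, -0.5, 0.25, 0.0]) + rng.normal(scale=0.1, size=150)

# Linear kernel; only the penalty coefficient C is tuned, via a grid
# search wrapped in 5-fold cross-validation (as in the paper)
grid = GridSearchCV(SVR(kernel="linear"),
                    param_grid={"C": [0.1, 1, 10, 100]},
                    cv=5).fit(X, y)
best_C = grid.best_params_["C"]
```

With a linear kernel, gamma plays no role, so C is the only hyperparameter left to tune.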

2.4.4. Backpropagation Neural Network

In the groundbreaking paper published in 1986 by David Rumelhart, Geoffrey Hinton, and Ronald Williams [65], the backpropagation training algorithm was introduced. This algorithm provided significant assistance in training multilayer perceptrons and led to the rapid development of backpropagation neural networks. It is an efficient technique for automatically computing the gradients needed for gradient descent. The backpropagation algorithm can calculate the gradient of the network error with respect to every model parameter in just two passes through the network (one forward pass and one backward pass); in other words, it determines how each connection weight and bias should be adjusted to reduce the error. Once these gradients are obtained, standard gradient descent steps follow, and the whole process repeats until the network converges to a solution. Backpropagation neural networks typically consist of an input layer, an output layer, and one or more hidden layers. Neurons in each layer are connected to neurons in the next layer, transmitting information: in each neuron, the outputs of the previous layer are linearly weighted to form the input, which is then passed through a nonlinear activation function and fed to the next layer.
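A small sketch of such a network using scikit-learn's `MLPRegressor`, which trains by backpropagated gradients (the hidden layer size and learning rate here are illustrative assumptions; the paper does not report its exact architecture):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# A mildly nonlinear toy target
X = rng.uniform(-1, 1, size=(300, 3))
y = X[:, 0] ** 2 + 0.5 * X[:, 1]

# Scaling the inputs helps gradient-based training converge
X_scaled = StandardScaler().fit_transform(X)

# One hidden layer; weights are adjusted by backpropagated gradients
# (forward pass, backward pass, gradient step, repeated until convergence)
bpnn = MLPRegressor(hidden_layer_sizes=(16,), activation="relu",
                    learning_rate_init=0.01, max_iter=2000,
                    random_state=0).fit(X_scaled, y)
r2 = bpnn.score(X_scaled, y)
```

Each fit call runs the forward/backward passes described above many times under the hood; only the layer sizes, activation, and learning rate are exposed as hyperparameters.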

2.4.5. Cross-Validation and Grid Search

The main idea of cross-validation is to divide the dataset into K parts, using K − 1 parts as the training set and the remaining part as the validation set; in this study, K was set to 5. Grid search is a method of tuning model parameters by exhaustive search and is usually combined with cross-validation to optimize model parameters. Looping over all candidate parameter combinations, the cross-validation score under each combination is computed, and the combination with the best score is selected. In this study, the RF model was optimized over the parameter "n-estimators", the SVR model over "C", and the BP neural network model over "n-estimators" and "learning rate".
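The combination of 5-fold cross-validation and grid search can be sketched for the RF "n_estimators" parameter (synthetic data; the candidate grid is an assumption, as the paper does not list its grid values):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, KFold

rng = np.random.default_rng(6)
# Synthetic linear target with mild noise
X = rng.uniform(size=(150, 4))
y = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.05, size=150)

# K = 5: each fold in turn serves as the validation set while the
# remaining four folds train the model
cv = KFold(n_splits=5, shuffle=True, random_state=0)
grid = GridSearchCV(RandomForestRegressor(random_state=0),
                    param_grid={"n_estimators": [50, 100, 200]},
                    cv=cv).fit(X, y)
best_n = grid.best_params_["n_estimators"]
```

`GridSearchCV` performs exactly the loop traversal described above: every candidate is scored by cross-validation and the best-scoring combination is refit on the full training data.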

2.5. Statistical Analysis

The post-data collection statistical analysis process is summarized in Figure 2’s flowchart. Samples with the same height from two periods were merged (n = 144), and the overall dataset was divided into training and testing datasets in an 8:2 ratio. Given the significant differences in winter wheat leaf area index under different nitrogen application levels, a stratified sampling approach was employed to partition the training and testing datasets. This was to avoid potential imbalances in sample counts for certain categories due to random sampling, which could affect model training and performance. Most features covering different experimental treatments were used to construct and fine-tune multiple regression models, including Multivariate Linear Regression (MLR), Random Forest (RF), Support Vector Machine Regression (SVM), and Backpropagation Neural Network (BPNN) regression models, using selected vegetation indices, texture, or a combination of both as inputs. Overfitting is prone to occur with a limited dataset. When the model adapts to the noise in the training data and the characteristics of a particular sample, the generalization performance of the model can be drastically reduced. Cross-validation can be employed to more comprehensively assess the model’s performance and generalization ability. The training process involved 5-fold cross-validation combined with grid search to find the optimal model hyperparameters and the final predictive performance was evaluated using the testing dataset. The evaluation metrics included the coefficient of determination, determined through cross-validation (R2, Equation (3)), root mean square error (RMSE, Equation (4)), and relative root mean square error (rRMSE, Equation (5)). R2 measures the extent to which the model explains the variability of the target variable, and its value ranges from 0 to 1. The closer it is to 1, the better the model fits the data. 
RMSE measures the difference between the model's predicted values and the actual values; the smaller the value, the better the model fits the data. The relative root mean square error (rRMSE) scales the RMSE by the mean of the observations, which makes the error metric easier to interpret and compare. In addition, to balance the predictive performance of the model against the variability of the data, the ratio of the standard deviation of the measurements to the RMSE (RPD) was also calculated in this paper. Rossel et al. [66] showed that 1.8 ≤ RPD < 2.0 indicates suitability for general prediction tasks, 2.0 ≤ RPD < 2.5 indicates suitability for accurate prediction tasks, and RPD ≥ 2.5 indicates very high prediction accuracy.
$R^2 = 1 - \dfrac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}$  (3)

$\mathrm{RMSE} = \sqrt{\dfrac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2}$  (4)

$\mathrm{rRMSE} = \mathrm{RMSE}/\bar{y}$  (5)

where $y_i$ is the measured LAI of sample $i$, $\hat{y}_i$ is the predicted LAI, $\bar{y}$ is the mean of the measured values, and $n$ is the number of samples.
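The four metrics can be computed directly from their definitions. The following is a minimal sketch written from Equations (3)-(5) plus the RPD definition; the function names and the toy values are ours, not the authors'.

```python
# Evaluation metrics as defined in the text (toy values for illustration).
import numpy as np

def r2(y, yhat):
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

def rmse(y, yhat):
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return np.sqrt(np.mean((y - yhat) ** 2))

def rrmse(y, yhat):
    return rmse(y, yhat) / np.mean(y)          # RMSE scaled by the mean observation

def rpd(y, yhat):
    return np.std(y, ddof=1) / rmse(y, yhat)   # SD of measurements over RMSE

y_obs = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
print(round(rmse(y_obs, y_pred), 3))           # -> 0.158
```

With these toy values the R2 is 0.98 and the RPD exceeds 2.5, which by the Viscarra Rossel et al. [66] thresholds would count as very high prediction accuracy.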

3. Results

3.1. Variability of Winter Wheat Leaf Area Index

The Leaf Area Index (LAI) of winter wheat exhibits significant variation, as anticipated, influenced by factors such as nitrogen fertilizer dosage, crop variety, and growth stage (Table 3). Across the two principal growth stages of the vegetative phase of winter wheat, both the training and testing datasets reveal an increasing LAI trend. In the training dataset, LAI values during the green-up and jointing stages range from 0.56 to 5.52, while in the testing dataset the range extends from 0.73 to 4.96. The substantial variability in LAI across these two growth stages indicates coverage of a wide range of scenarios, which supports the feasibility of estimating winter wheat LAI from remote sensing and unmanned aerial vehicle (UAV) data.

3.2. Feature Selection

The heat map (Figure A1 in Appendix A) illustrates that during the vegetative growth stages of winter wheat, namely the green-up and jointing stages, there exists a high degree of correlation among some of the predictive factors, whether they are vegetation indices (VIs) or texture indices. This suggests a potential issue of overfitting when utilizing all VIs or texture indices. High correlations among features can diminish the interpretability of the model, making it challenging to understand the factors influencing predictions. Through feature selection, we can choose the most representative features, reduce feature dimensionality, and enhance both model interpretability and generalization performance.
When employing Lasso linear regression for feature selection, a crucial hyperparameter must be tuned: the regularization (penalty) parameter, typically denoted α (alpha), which controls the strength of the L1 regularization term and thereby influences the feature selection process. A larger α drives more feature coefficients to zero, producing a more aggressive selection, while a smaller α reduces sparsity in the coefficients, retaining more features in the model. There is no universal rule for choosing α; in this study, it was determined by cross-validation to minimize the root mean square error (Figure 3).
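A minimal sketch of this procedure with scikit-learn's `LassoCV`, which picks α by cross-validation and then lets the non-zero coefficients define the retained feature set. The synthetic data stand in for the candidate indices; standardization before Lasso is our assumption, since the method is scale-sensitive.

```python
# Choosing Lasso's alpha by cross-validation and keeping only features
# with non-zero coefficients (synthetic stand-ins for the VIs/TIs).
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 8))                  # 8 candidate indices
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(scale=0.1, size=120)

Xs = StandardScaler().fit_transform(X)         # put features on one scale
lasso = LassoCV(cv=5, random_state=0).fit(Xs, y)   # alpha minimizing CV error

kept = np.flatnonzero(lasso.coef_)             # indices of retained features
print("alpha:", round(lasso.alpha_, 4), "kept:", kept)
```

Features 0 and 3, which actually drive the synthetic response, survive the selection; irrelevant features are driven toward zero coefficients.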
The Recursive Feature Elimination (RFE) method is based on the idea of iteratively removing features that contribute less to the model’s performance. In this study, ridge regression (RR) was selected as the evaluation criterion for RFE. The optimal number of predictive factors was determined based on cross-validation to minimize RMSE (Figure 4).
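The RFE-with-cross-validation step can be sketched as follows, using ridge regression as the base estimator as in the study. The data are synthetic and the ridge penalty and scoring choice are illustrative assumptions.

```python
# RFECV: recursively drop the weakest feature, pick the subset that
# minimizes cross-validated RMSE (synthetic data, ridge base estimator).
import numpy as np
from sklearn.feature_selection import RFECV
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 10))
y = 3.0 * X[:, 1] + 2.0 * X[:, 5] + rng.normal(scale=0.1, size=120)

selector = RFECV(Ridge(alpha=1.0), step=1, cv=5,
                 scoring="neg_root_mean_squared_error").fit(X, y)
print("optimal n_features:", selector.n_features_)
print("selected mask:", selector.support_)
```

`selector.support_` marks the retained predictors; the informative features (indices 1 and 5 here) are kept while uninformative ones are eliminated.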
As shown in Table 4, with the Lasso feature selection method at a flight height of 20 m, 5 spectral variables, 4 texture variables, and 9 combined spectral-plus-texture variables were retained; at 40 m, 4 spectral variables, 6 texture variables, and 9 combined variables were retained. With the RFECV feature selection method at 20 m, 17 spectral variables, 15 texture variables, and 10 combined variables were retained; at 40 m, 2 spectral variables, 23 texture variables, and 3 combined variables were retained.
The types and numbers of optimal variables differ considerably across flight altitudes and feature selection methods. Nevertheless, within a given dataset, some features are selected regardless of altitude or selection method; for example, the texture indices r-mean and n-mean are chosen repeatedly under different altitudes and different selection methods. Other features, such as NIR and re-correlation, appear only once among the optimal variables.

3.3. The Best Model for Predicting Winter Wheat LAI

3.3.1. The Best Model Based on Vegetation Indices

The model based on the vegetation index (Table 5) shows that the RF model and MLR model have the best predictive performance under different flight heights and feature selection methods. At a flight height of 20 m and with Lasso feature selection, the RF model performs as follows on the training set: R2 = 0.9171, RPD = 3.4742, RMSE = 0.3227, and RRMSE = 0.0981. On the test set, R2 = 0.8912, RPD = 3.0313, RMSE = 0.4201, and RRMSE = 0.1324. At a flight height of 40 m and with RFECV selection, the MLR model performs as follows on the training set: R2 = 0.8512, RPD = 2.5923, RMSE = 0.4324, and RRMSE = 0.1315. On the test set, it performs as follows: R2 = 0.8943, RPD = 3.0763, RMSE = 0.4139, and RRMSE = 0.1304. Specifically, when estimating the LAI of winter wheat during the vegetative growth stages (green-up and jointing stages) based on the vegetation index, the RFECV-MLR model at a flight height of 40 m is considered the best model.

3.3.2. The Best Model Based on Texture Indices

The model based on texture index (Table 6) shows that the RF model has the best predictive performance under different flight heights and different feature selection methods. Under Lasso feature selection at a flight height of 20 m, the RF model exhibits the following performance indicators: training set R2 = 0.9131, RPD = 3.3923, RMSE = 0.3304, and RRMSE = 0.1005; test set R2 = 0.8792, RPD = 2.8770, RMSE = 0.4426, and RRMSE = 0.1395. Under RFECV feature selection at a flight height of 40 m, the RF model exhibits the following performance indicators: training set R2 = 0.9061, RPD = 3.2633, RMSE = 0.3435, and RRMSE = 0.1044; test set R2 = 0.8894, RPD = 3.0063, RMSE = 0.4236, and RRMSE = 0.1335. Specifically, when estimating the LAI of winter wheat during the vegetative growth stages (green-up and jointing stages) based on the texture index, the RFECV-RF model at a flight height of 40 m is considered the best model.

3.3.3. The Best Model Based on a Combination of Vegetation Indices and Texture Indices

The model based on the vegetation index and texture index (Table 7) shows that the RF model has the best predictive performance under different flight heights and feature selection methods. Under the RFECV feature selection at a flight height of 20 m, the RF model exhibits the following performance indicators: training set R2 = 0.9437, RPD = 4.2145, RMSE = 0.2660, and RRMSE = 0.0809; test set R2 = 0.9073, RPD = 3.2842, RMSE = 0.3877, and RRMSE = 0.1222. Under the RFECV feature selection at a flight height of 40 m, the RF model exhibits the following performance indicators: training set R2 = 0.9366, RPD = 3.9724, RMSE = 0.2822, and RRMSE = 0.0858; test set R2 = 0.9210, RPD = 3.5575, RMSE = 0.3579, and RRMSE = 0.1128. Specifically, when estimating the LAI of winter wheat during the vegetative growth stages (green-up and jointing stages) based on the vegetation index and texture index, the RFECV-RF model at a flight height of 40 m is considered the best model.
A scatter plot of measured versus predicted LAI is shown in Figure 5. The data points are distributed close to the 1:1 line, indicating that the model has high prediction accuracy.

4. Discussion

4.1. Best LAI Inversion Model

This study involved four wheat varieties, four N application rates, and five N application methods across 72 plots. LAI differs significantly from plot to plot, and the relationship between LAI and spectral variables is complex. Therefore, this paper, based on machine learning, considered texture variables and spectral variables simultaneously. Four models were constructed under different flight heights (20 m and 40 m) and different input datasets (spectral variables, texture variables, and spectral variables combined with texture variables). The results showed that the best model was obtained by using a nonlinear model after adding texture variables (RFECV-RF, R2 = 0.9210, RMSE = 0.3579, RRMSE = 0.1128, RPD = 3.5575). By the criteria of Viscarra Rossel et al. [66], RFECV-RF (RPD = 3.5575) achieved very high prediction accuracy for wheat LAI. Thus, although this study involved multiple wheat varieties, nitrogen application levels, and application methods, RFECV-RF based on texture and spectral variables achieved high-accuracy LAI estimation.
This study supplied a rich pool of spectral and texture variables for feature screening, whereas previous studies have often manually selected a limited number of spectral variables and simply calculated Pearson correlation coefficients between those variables and the agronomic parameters to judge their relevance. When the relationship between the agronomic parameters and the spectral variables is linear, this practice can yield a model with excellent performance. However, when the relationship is nonlinear, the accuracy of the resulting model is limited. In this study, different feature screening methods were compared while providing a rich set of candidate variables, which allowed RFECV-RF to achieve excellent performance.
In addition, this study was conducted during the vegetative growth stages of wheat (green-up stage and jointing stage). Unlike models established in single growth stages in previous studies, models spanning multiple growth stages have higher practicality in agricultural production, as they can more accurately predict the overall trend and changes in wheat LAI values while saving time and resources in production.

4.2. Model Comparison

This article presents the final results of building models with four machine learning algorithms (considering two flight altitudes, three variable dataset inputs, and two variable selection methods). A clear pattern emerges: across flight altitudes and selection methods, RF or MLR is the most accurate model in most cases, while SVM and BPNN are not as outstanding as RF and MLR.
BPNN usually consists of an input layer, a hidden layer and an output layer. Each layer that makes up a BPNN contains multiple neurons. The different layers of a BPNN are connected to each other by weighting parameters. Thus, the complexity and performance of a BPNN depends on the number of neurons and layers, and how the weighting parameters are tuned. When the dataset is large, BPNN can adjust its parameters to better capture complex patterns and features in the input data. In this study, the reason BPNN performs worse than simple multiple linear regression may be that when the dataset is small, the parameters cannot be adjusted to the optimal values, leading to overfitting and an inability to improve the model’s generalization ability. In comparison, multiple linear regression has stronger robustness.
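The layered structure described above can be sketched with scikit-learn's `MLPRegressor`, a BPNN analogue whose weights are tuned by backpropagation. The layer sizes and data here are illustrative, not the authors' configuration.

```python
# A minimal BPNN analogue: input layer (6 features) -> one hidden layer
# (16 neurons) -> output, weights tuned by backpropagation (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 6))
y = X @ np.array([1.0, 0.5, 0.0, -0.5, 0.2, 0.0]) + rng.normal(scale=0.1, size=100)

bpnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                    random_state=0).fit(X, y)
print("train R2:", round(bpnn.score(X, y), 3))
```

With only ~100 samples, such a network has many weights relative to the data, which illustrates why careful tuning (or more data) is needed before it can outperform a simple linear model.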
The MLR model extends simple linear regression to multiple input variables, but using all VIs or TIs as inputs may lead to overfitting due to the increased model complexity. To avoid this problem, this article conducts feature selection based on previous research. The selected variables greatly reduce the risk of model overfitting while retaining the advantages of linear models: strong interpretability, high computational efficiency, and wide applicability. However, MLR also has limitations and cannot capture nonlinear relationships.
The RF model achieves the best prediction results with the minimum RMSE value among all input processing, which is consistent with the research by Tang et al. [67]. RF is a powerful machine learning algorithm formed by integrating decision trees. Decision trees divide data at nodes in a nonlinear way, which means that RF can capture complex nonlinear relationships between input features. RF uses bootstrap sampling to extract multiple subsets from the original samples, constructs independent decision trees for each subset, and then combines the prediction results from multiple decision trees, overcoming overfitting problems while dealing with outliers or noise. Previous studies have also shown that RF models tend to achieve high accuracy due to their stability and robustness in handling large amounts of data.
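The contrast between RF's nonlinear node splits and MLR's linear fit can be demonstrated on a deliberately nonlinear target. This is an illustrative sketch with synthetic data, not the study's dataset.

```python
# RF (bootstrap-sampled trees, averaged) vs. linear regression on a
# nonlinear target that a linear model cannot represent.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2      # nonlinear relationship

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
lin = LinearRegression().fit(X, y)
print("RF R2:", round(rf.score(X, y), 3),
      "MLR R2:", round(lin.score(X, y), 3))
```

The forest fits the sine and quadratic structure almost perfectly, while the linear model explains very little of it; this mirrors why RF can outperform MLR when the LAI-feature relationship is nonlinear.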

4.3. The Combination of Vegetation Indices and Texture Information for Crop LAI Estimation

The spectral reflection characteristics of vegetation provide the basis for crop monitoring. This study analyzed the applicability of 40 common spectral variables in constructing an LAI estimation model for winter wheat. After Lasso and RFECV feature selection, the predictive models achieved high accuracy, which corroborates the findings of Liu et al. [68] and Fu et al. [69]. Texture, obtained from multispectral images, is a key spatial feature containing information about the canopy surface for crop phenotyping studies, and has therefore been increasingly used in research (Li et al. [70]; Lu et al. [14]; Sarker and Nichol [71]). Hlatshwayo et al. [29] found that the texture of the red and near-infrared bands is more significantly related to LAI than some vegetation indices. Pu and Cheng [16] found that texture-based features extracted from the same WorldView-2 data have a better ability to estimate and map forest LAI compared to spectral-based features.
For winter wheat LAI, we also constructed machine-learning models based on 40 texture features. The results showed that the accuracy of the texture-only model is lower than that of the vegetation-index model, and that more texture features than vegetation indices are retained under the different flight heights and feature selection methods. Vegetation indices are typically combinations of multiple bands, whereas texture features are pixel statistics of a single band, which limits their predictive ability for vegetation parameters. Inspection of the selected vegetation indices reveals that most involve the NIR band, which has been shown to respond effectively to the dynamics of LAI [11]. Examples include NDREI, which combines the red-edge and near-infrared bands and correlates with changes in chlorophyll content, leaf area, and background soil, and MSR, which reduces observational noise by combining the near-infrared and red bands. These factors may explain why vegetation indices alone outperform texture indices alone. Texture analysis, which extracts color-independent spatial information, helps to identify objects or regions of interest in an image, and the selection of appropriate bands and texture measures is critical for crop LAI monitoring. Previous studies have shown that combining texture indices and vegetation indices can improve performance. Zheng et al. [18] reported that combining vegetation indices and texture improved the accuracy of rice biomass estimation, and Hlatshwayo et al. [29] combined vegetation indices (VIs), color indices (CIs), and texture indices (TIs) using the random forest (RF) method to improve the estimation accuracy of LAI and leaf dry mass (LDM). The results of this study also confirm this point.
The best model based on texture variables, the RFECV-RF model (R2 = 0.8894, RMSE = 0.4236, RRMSE = 0.1335, RPD = 3.0063), has significantly lower accuracy than the RFECV-RF model combining vegetation and texture indices (R2 = 0.9210, RMSE = 0.3579, RRMSE = 0.1128, RPD = 3.5575).

4.4. The Impact of Different Flight Heights on Winter Wheat LAI Prediction

Previous studies have often neglected the effects of different flight altitudes when predicting LAI. To investigate the accuracy of winter wheat LAI estimation based on multispectral UAV models in response to different spatial resolutions, different flight altitudes of 20 m (resolution of 1.06 cm) and 40 m (resolution of 2.12 cm) were used in this study. Despite the twofold difference in spatial resolutions at various flight altitudes, the results of winter wheat LAI estimation indicated no significant difference in the accuracy of the model when predicting LAI at 20 m and 40 m flight altitudes. This suggests that the resolution of the image and the accuracy of the model are not proportional when making winter wheat LAI predictions, a finding that is consistent with that of Broge et al. [45]. Zhang et al. [72] also considered the scale effect of calculating vegetation indices and texture indices, and their impact on the estimation of wheat growth parameters (LAI and LDM) using machine learning algorithms. They extracted textures from unmanned aerial images at different heights during the wheat jointing stage, with pixel resolutions of 8 cm (80 m height), 10 cm (100 m), 12 cm (120 m), 15 cm (150 m), and 18 cm (180 m). The results showed that the texture at 80 m height was highly correlated with LAI and LDM, and the correlation remained generally stable at heights of 100 m, 120 m, and 150 m, despite some minor fluctuations. However, the correlation significantly decreased at a height of 180 m. This may be because when the pixel resolution is reduced to a certain extent, there are too many mixed pixels, making it difficult to distinguish between soil and vegetation, resulting in poor performance in crop growth monitoring. Although this study only explored flight heights of 20 m and 40 m, the research results on flight heights have important guiding significance for practical production. 
For example, the DJI P4M drone used in this study required 45 min to cover the entire study area (0.3 ha) at a height of 20 m, with a battery change needed halfway through. At a flight height of 40 m, covering the same area took only 15 min and required no battery change. In addition, long flights not only deplete the battery; the drone may also capture photos under changing lighting conditions (caused by changes in solar zenith angle and cloud movement). The resulting uneven illumination between images affects subsequent data analysis and processing, and ultimately the prediction of LAI [73].

4.5. Limiting Factors and Future Research Prospects

When collecting multispectral images, the choices of UAV flight altitude were 20 and 40 m. Higher flight altitudes will be explored in the future to further study the effect of the spatial resolution of UAV multispectral images on the remote sensing monitoring of winter wheat LAI.
Furthermore, this study utilized data from four different winter wheat varieties at the vegetative growth stages (green-up and jointing stages) and 72 experimental plots. Future research can collect data from a broader range of winter wheat varieties across multiple growth stages, employing larger datasets, and establishing models that cover the entire growth cycle. This approach will better accommodate various growth stages, and enhance model applicability and reliability.

5. Conclusions

In this study, the feasibility of predicting winter wheat LAI based on three scenarios, namely, the vegetation index, texture index, and the combination of vegetation index and texture index, was investigated at different flight altitudes (20 m and 40 m) during the vegetative growth stages (green-up and jointing stages) of winter wheat. The results showed that a combination of different feature screening methods and machine learning algorithms can be constructed to estimate winter wheat LAI with high accuracy. The model accuracy of the combination of texture index and vegetation index was higher than that of the vegetation index model alone and the texture index model alone.
In this study, it was found that accurate models for winter wheat estimation could be established at both 20 m and 40 m flight altitudes. Therefore, in practice, we could choose a flight altitude of 40 m. This reduces the flight time relative to the 20 m flight altitude, which results in fewer battery changes and improved image quality.
In this study, a high-accuracy model for estimating the LAI of winter wheat was established by optimizing the UAV flight strategy and combining multiple feature selection methods with machine learning algorithms. The model is applicable to the key stages of winter wheat vegetative growth (green-up and jointing stages), as well as to multiple winter wheat varieties, giving it high generalizability and practicality.
Across the different datasets, screening methods, and flight altitudes, the RF model almost always achieved higher prediction accuracy than the other machine learning methods used in this study (MLR, SVM, BPNN). However, further expansion of the database and improvements in model quality are needed to accurately capture the growth metrics of different crops in large-scale agricultural fields and to improve the generalizability of the results across geographic regions and crops.

Author Contributions

Conceptualization, J.W.; formal analysis, W.L. and J.W.; funding acquisition, Z.H. and W.W.; investigation, W.L., Q.Y., Y.Z., J.W., W.W. and G.Z.; methodology, W.L. and J.W.; supervision, J.W. and Z.H.; visualization, W.L., Q.Y. and Y.Z.; writing—original draft, W.L. and J.W.; writing—review and editing, W.L. and J.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Jiangsu Agricultural Science and Technology Innovation Fund (CX (22)1001), the Key Research and Development Program (Modern Agriculture) of Jiangsu Province (BE2020319), National Natural Science Foundation of China (32301937), China Postdoctoral Science Foundation (2021M692721), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), China.

Data Availability Statement

The data are available from the authors upon reasonable request, as they are needed for further use.

Acknowledgments

Special thanks to Zhi Ding and Junhan Zhang, Agricultural College of Yangzhou University, for their kind help in field surveys.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Settings of UAV Flight Parameters, and Correlation Analysis Heatmaps

Table A1. Settings of UAV flight parameters.
| Parameters | Setting 1 | Setting 2 |
| --- | --- | --- |
| Flight altitude | 20 m | 40 m |
| Flight speed | 3 m/s | 3 m/s |
| Heading overlap rate | 80% | 80% |
| Sideways overlap rate | 80% | 80% |
| Resolution | 1.06 cm | 2.12 cm |
Figure A1. Correlation heatmaps. (a) LAI and vegetation indices plus texture indices at 20 m altitude. (b) LAI and vegetation indices plus texture indices at 40 m altitude.

References

  1. Watson, D. Comparative Physiological Studies on the Growth of Field Crops: II. The Effect of Varying Nutrient Supply on Net Assimilation Rate and Leaf Area. Ann. Bot. 1947, 11, 375–407. [Google Scholar] [CrossRef]
  2. Wells, R. Soybean Growth Response to Plant Density: Relationships among Canopy Photosynthesis, Leaf Area, and Light Interception. Crop. Sci. 1991, 31, 755–761. [Google Scholar] [CrossRef]
  3. Souri, M.K.; Hatamian, M. Aminochelates in plant nutrition: A review. J. Plant Nutr. 2019, 42, 67–78. [Google Scholar] [CrossRef]
  4. Richards, R.; Townley-Smith, T. Variation in leaf area development and its effect on water use, yield and harvest index of droughted wheat. Aust. J. Agric. Res. 1987, 38, 983–992. [Google Scholar] [CrossRef]
  5. Schleppi, P.; Thimonier, A.; Walthert, L. Estimating leaf area index of mature temperate forests using regressions on site and vegetation data. For. Ecol. Manag. 2011, 261, 601–610. [Google Scholar] [CrossRef]
  6. Shang, J.; Liu, J.; Ma, B.; Zhao, T.; Jiao, X.; Geng, X.; Huffman, T.; Kovacs, J.M.; Walters, D. Mapping spatial variability of crop growth conditions using RapidEye data in Northern Ontario, Canada. Remote Sens. Environ. 2015, 168, 113–125. [Google Scholar] [CrossRef]
  7. Zhang, J.; Liu, X.; Liang, Y.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Using a Portable Active Sensor to Monitor Growth Parameters and Predict Grain Yield of Winter Wheat. Sensors 2019, 19, 1108. [Google Scholar] [CrossRef]
  8. Zhu, W.; Sun, Z.; Huang, Y.; Lai, J.; Li, J.; Zhang, J.; Yang, B.; Li, B.; Li, S.; Zhu, K.; et al. Improving Field-Scale Wheat LAI Retrieval Based on UAV Remote-Sensing Observations and Optimized VI-LUTs. Remote Sens. 2019, 11, 2456. [Google Scholar] [CrossRef]
  9. Lee, K.; Kyung, D.; Park, C.-W.; Ho, S.K.; Na, S.-I. Selection of Optimal Vegetation Indices and Regression Model for Estimation of Rice Growth Using UAV Aerial Images. Korean J. Soil Sci. Fertil. 2017, 50, 409–421. [Google Scholar] [CrossRef]
  10. Gong, Y.; Yang, K.; Lin, Z.; Fang, S.; Wu, X.; Zhu, R.; Peng, Y. Remote estimation of leaf area index (LAI) with unmanned aerial vehicle (UAV) imaging for different rice cultivars throughout the entire growing season. Plant Methods 2021, 17, 88. [Google Scholar] [CrossRef]
  11. Zhang, J.; Cheng, T.; Guo, W.; Xu, X.; Qiao, H.; Xie, Y.; Ma, X. Leaf area index estimation model for UAV image hyperspectral data based on wavelength variable selection and machine learning methods. Plant Methods 2021, 17, 49. [Google Scholar] [CrossRef]
  12. Fauvel, M.; Tarabalka, Y.; Benediktsson, J.A.; Chanussot, J.; Tilton, J.C. Advances in spectral-spatial classification of hyperspectral images. Proc. IEEE 2013, 101, 652–675. [Google Scholar] [CrossRef]
  13. Eckert, S. Improved Forest Biomass and Carbon Estimations Using Texture Measures from WorldView-2 Satellite Data. Remote Sens. 2012, 4, 810–829. [Google Scholar] [CrossRef]
  14. Lu, D. Aboveground biomass estimation using Landsat TM data in the Brazilian Amazon. Int. J. Remote Sens. 2005, 26, 2509–2525. [Google Scholar] [CrossRef]
  15. Pu, R.; Landry, S. Evaluating seasonal effect on forest leaf area index mapping using multi-seasonal high resolution satellite pléiades imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 80, 268–279. [Google Scholar] [CrossRef]
  16. Pu, R.; Cheng, J. Mapping forest leaf area index using reflectance and textural information derived from WorldView-2 imagery in a mixed natural forest area in Florida, US. Int. J. Appl. Earth Obs. Geoinf. 2015, 42, 11–23. [Google Scholar] [CrossRef]
  17. Bolívar-Santamaría, S.; Reu, B. Detection and characterization of agroforestry systems in the Colombian Andes using sentinel-2 imagery. Agrofor. Syst. 2021, 95, 499–514. [Google Scholar] [CrossRef]
  18. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  19. Lu, D.; Batistella, M. Exploring TM image texture and its relationships with biomass estimation in Rondônia, Brazilian Amazon. Acta Amaz. 2005, 35, 249–257. [Google Scholar] [CrossRef]
  20. Kelsey, K.C.; Neff, J.C. Estimates of Aboveground Biomass from Texture Analysis of Landsat Imagery. Remote Sens. 2014, 6, 6407–6422. [Google Scholar] [CrossRef]
  21. Zadoks, J.C.; Chang, T.T.; Konzak, C.F. A decimal code for the growth stages of cereals. Weed Res. 1974, 14, 415–421. [Google Scholar] [CrossRef]
  22. Liu, C.; Wang, W.L.; Zhao, C.; Li, G.H.; Xu, K.; Huo, Z.Y. Current status and prospects of research on late spring frost in wheat. Jiangsu J. Agric. Sci. 2022, 38, 1115–1122. [Google Scholar]
  23. Dong, T.; Liu, J.; Shang, J.; Qian, B.; Ma, B.; Kovacs, J.M.; Walters, D.; Jiao, X.; Geng, X.; Shi, Y. Assessment of red-edge vegetation indices for crop leaf area index estimation. Remote Sens. Environ. 2018, 222, 133–143. [Google Scholar] [CrossRef]
  24. Shang, J.; Liu, J.; Huffman, T.; Qian, B.; Pattey, E.; Wang, J.; Zhao, T.; Geng, X.; Kroetsch, D.; Dong, T.; et al. Estimating plant area index for monitoring crop growth dynamics using Landsat-8 and RapidEye images. J. Appl. Remote Sens. 2014, 8, 85196. [Google Scholar] [CrossRef]
  25. Weiss, M.; Baret, F. Can-EyeV6.1 User Manual; EMMAH Laboratory (Mediterranean Environment and Agro-Hydro System Modelisation), National Institute of Agricultural Research (INRA): Paris, France, 2010. [Google Scholar]
  26. Black, T.A.; Chen, J.-M.; Lee, X.; Sagar, R.M. Characteristics of shortwave and longwave irradiances under a Douglas-fir forest stand. Can. J. For. Res. 1991, 21, 1020–1028. [Google Scholar] [CrossRef]
  27. Lang, A. Application of some of Cauchy’s theorems to estimation of surface areas of leaves, needles and branches of plants, and light transmittance. Agric. For. Meteorol. 1991, 55, 191–212. [Google Scholar] [CrossRef]
  28. Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, 6, 610–621. [Google Scholar] [CrossRef]
  29. Hlatshwayo, S.T.; Mutanga, O.; Lottering, R.T.; Kiala, Z.; Ismail, R. Mapping forest aboveground biomass in the reforested Buffelsdraai landfill site using texture combinations computed from SPOT-6 pan-sharpened imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 74, 65–77. [Google Scholar] [CrossRef]
  30. Ihmad, I.; Reid, J.F. Evaluation of Colour Representations for Maize Images. J. Agric. Eng. Res. 1995, 63, 185–195. [Google Scholar] [CrossRef]
  31. Kawashima, S.; Nakatani, M. An Algorithm for Estimating Chlorophyll Content in Leaves Using a Video Camera. Ann. Bot. 1998, 81, 49–54. [Google Scholar] [CrossRef]
  32. Saberioon, M.M.; Amin, M.S.M.; Anuar, A.R.; Gholizadeh, A.; Wayayok, A.; Khairunniza-Bejo, S. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2014, 32, 35–45. [Google Scholar] [CrossRef]
  33. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  34. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  35. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  36. Rouse, J.W., Jr.; Haas, R.H.; Deering, D.; Schell, J.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; NASA/GSFCT Type III Final Report, 371; NASA: Washington, DC, USA, 1974. [Google Scholar]
  37. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  38. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  39. Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C. Coincident Detection of Crop Water Stress, Nitrogen Status and Canopy Density Using Ground-Based. 2000. Available online: https://www.tucson.ars.ag.gov/unit/publications/PDFfiles/1356.pdf (accessed on 3 August 2023).
  40. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  41. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  42. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E., III. Estimating Corn Leaf Chlorophyll Concentration from Leaf and Canopy Reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  43. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  44. Schleicher, T.D.; Bausch, W.C.; Delgado, J.A.; Ayers, P.D. Evaluation and Refinement of the Nitrogen Reflectance Index (NRI) for Site-Specific Fertilizer Management. In Proceedings of the 2001 ASAE Annual Meeting, Sacramento, CA, USA, 29 July–1 August 2001. [Google Scholar]
  45. Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
  46. Chen, J.M. Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications. Can. J. Remote Sens. 1996, 22, 229–242. [Google Scholar] [CrossRef]
  47. Peñuelas, J.; Filella, I.; Gamon, J.A. Assessment of photosynthetic radiation-use efficiency with spectral reflectance. New Phytol. 1995, 131, 291–296. [Google Scholar] [CrossRef]
  48. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141. [Google Scholar] [CrossRef]
  49. Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 1248. [Google Scholar] [CrossRef]
  50. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  51. Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing carotenoid content in plant leaves with reflectance spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef]
  52. Goel, N.S.; Qin, W. Influences of canopy architecture on relationships between various vegetation indices and LAI and Fpar: A computer simulation. Remote Sens. Rev. 1994, 10, 309–347. [Google Scholar] [CrossRef]
  53. Ke, W.; Zhangquan, S.; Renchao, W. Effects of nitrogen nutrition on the spectral reflectance characteristics of rice leaf and canopy. Zhejiang Agric. Univ. 1998, 24, 93–97. [Google Scholar]
  54. Dash, J.; Curran, P.J. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413. [Google Scholar] [CrossRef]
  55. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  56. Peñuelas, J.; Gamon, J.; Fredeen, A.; Merino, J.; Field, C. Reflectance indices associated with physiological changes in nitrogen- and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146. [Google Scholar] [CrossRef]
  57. Hague, T.; Tillett, N.; Wheeler, H. Automated Crop and Weed Monitoring in Widely Spaced Cereals. Precis. Agric. 2006, 7, 21–32. [Google Scholar] [CrossRef]
  58. Li, F.W.; Lai, L.F.; Cui, S.G. On the Adversarial Robustness of LASSO Based Feature Selection. IEEE Trans. Signal Process. 2021, 69, 5555–5567. [Google Scholar] [CrossRef]
  59. Wang, G.; Sun, L.; Wang, W.; Chen, T.; Guo, M.; Zhang, P. A feature selection method combined with ridge regression and recursive feature elimination in quantitative analysis of laser induced breakdown spectroscopy. Plasma Sci. Technol. 2020, 22, 074002. [Google Scholar] [CrossRef]
  60. Hocking, R.R. A Biometrics Invited Paper. The Analysis and Selection of Variables in Linear Regression. Biometrics 1976, 32, 1–49. [Google Scholar] [CrossRef]
  61. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  62. Brereton, R.G.; Lloyd, G.R. Support Vector Machines for classification and regression. Analyst 2010, 135, 230–267. [Google Scholar] [CrossRef] [PubMed]
  63. Goh, A.T. Back-propagation neural networks for modeling complex systems. Artif. Intell. Eng. 1995, 9, 143–151. [Google Scholar] [CrossRef]
  64. Ho, T.K. The random subspace method for constructing decision forests. IEEE Trans. Pattern Anal. Mach. Intell. 1998, 20, 832–844. [Google Scholar]
  65. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Internal Representations by Error Propagation. 1985. Available online: https://stanford.edu/~jlmcc/papers/PDP/Volume%201/Chap8_PDP86.pdf (accessed on 3 August 2023).
  66. Viscarra Rossel, R.A.; Taylor, H.J.; McBratney, A.B. Multivariate calibration of hyperspectral γ-ray energy spectra for proximal soil sensing. Eur. J. Soil Sci. 2007, 58, 343–353. [Google Scholar] [CrossRef]
  67. Tang, X.; Liu, H.; Feng, D.; Zhang, W.; Chang, J.; Li, L.; Yang, L. Prediction of field winter wheat yield using fewer parameters at middle growth stage by linear regression and the BP neural network method. Eur. J. Agron. 2022, 141, 126621. [Google Scholar] [CrossRef]
  68. Liu, S.; Zeng, W.; Wu, L.; Lei, G.; Chen, H.; Gaiser, T.; Srivastava, A.K. Simulating the Leaf Area Index of Rice from Multispectral Images. Remote Sens. 2021, 13, 3663. [Google Scholar] [CrossRef]
  69. Fu, Z.; Jiang, J.; Gao, Y.; Krienke, B.; Wang, M.; Zhong, K.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Wheat Growth Monitoring and Yield Estimation based on Multi-Rotor Unmanned Aerial Vehicle. Remote Sens. 2020, 12, 508. [Google Scholar] [CrossRef]
  70. Li, M.; Tan, Y.; Pan, J.; Peng, S. Modeling forest aboveground biomass by combining spectrum, textures and topographic features. Front. For. China 2008, 3, 10–15. [Google Scholar] [CrossRef]
  71. Sarker, L.R.; Nichol, J.E. Improved forest biomass estimates using ALOS AVNIR-2 texture indices. Remote Sens. Environ. 2011, 115, 968–977. [Google Scholar] [CrossRef]
  72. Zhang, J.; Qiu, X.; Wu, Y.; Zhu, Y.; Cao, Q.; Liu, X.; Cao, W. Combining texture, color, and vegetation indices from fixed-wing UAS imagery to estimate wheat growth parameters using multivariate regression methods. Comput. Electron. Agric. 2021, 185, 106138. [Google Scholar] [CrossRef]
  73. de Wasseige, C.; Bastin, D.; Defourny, P. Seasonal variation of tropical forest LAI based on field measurements in Central African Republic. Agric. For. Meteorol. 2003, 119, 181–194. [Google Scholar] [CrossRef]
Figure 1. The location of the study area and the spatial distribution of 72 experimental plots.
Figure 2. Research experiment and statistical analysis process (Experiments I and II are indicated by yellow and black boxes, respectively).
Figure 3. Determination of alpha based on cross-validated minimum RMSE in the vegetation index (a), the texture index (b), and the combination of the vegetation index and the texture index (c).
Figure 4. RMSE varies with the increasing number of features for vegetation indices (a), texture indices (b), and the combination of vegetation and texture indices (c). The optimal number of predictive factors is selected based on the minimum RMSE.
Figure 5. The best LAI estimation models in the test set are as follows: (a) Lasso-RF with 20 m VIs as input. (b) Lasso-RF with 20 m TIs as input. (c) RFECV-RF with 20 m VIs and TIs as inputs. (d) RFECV-MLR with 40 m VIs as input. (e) RFECV-RF with 40 m TIs as input. (f) RFECV-RF with 40 m VIs and TIs as inputs.
Table 2. Eight texture variables used for LAI estimation in this study.
Textural Index | Formula | Reference
Mean | $\mu_i = \sum_{i,j=0}^{N-1} i\,P_{i,j}$; $\mu_j = \sum_{i,j=0}^{N-1} j\,P_{i,j}$ | [28]
Variance | $\sigma_i^2 = \sum_{i,j=0}^{N-1} P_{i,j}(i-\mu_i)^2$; $\sigma_j^2 = \sum_{i,j=0}^{N-1} P_{i,j}(j-\mu_j)^2$ | [28]
Homogeneity | $\sum_{i,j=0}^{N-1} P_{i,j}/\left[1+(i-j)^2\right]$ | [28]
Contrast | $\sum_{i,j=0}^{N-1} P_{i,j}(i-j)^2$ | [28]
Dissimilarity | $\sum_{i,j=0}^{N-1} P_{i,j}\,\lvert i-j\rvert$ | [28]
Entropy | $-\sum_{i,j=0}^{N-1} P_{i,j}\ln P_{i,j}$ | [28]
Second Moment | $\sum_{i,j=0}^{N-1} P_{i,j}^2$ | [28]
Correlation | $\sum_{i,j=0}^{N-1} P_{i,j}\,(i-\mu_i)(j-\mu_j)/\sqrt{\sigma_i^2\sigma_j^2}$ | [28]
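The texture measures in Table 2 are all simple sums over the normalized gray-level co-occurrence matrix (GLCM). The sketch below (function and variable names are illustrative, not from the paper, which used a remote-sensing toolchain) builds a single-offset GLCM with NumPy and derives the eight measures; in practice a library such as scikit-image would typically be used per band and per offset direction.

```python
import numpy as np

def glcm_features(img, levels, dx=1, dy=0):
    """Build a normalized GLCM for one pixel offset (dx, dy) and derive
    the eight texture measures of Table 2 (hypothetical helper)."""
    P = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    P /= P.sum()  # normalize counts to co-occurrence probabilities

    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mu_i = (i * P).sum()
    mu_j = (j * P).sum()
    var_i = (P * (i - mu_i) ** 2).sum()
    var_j = (P * (j - mu_j) ** 2).sum()
    nz = P > 0  # mask zero cells to avoid log(0) in the entropy term
    return {
        "mean": (mu_i, mu_j),
        "variance": (var_i, var_j),
        "homogeneity": (P / (1.0 + (i - j) ** 2)).sum(),
        "contrast": (P * (i - j) ** 2).sum(),
        "dissimilarity": (P * np.abs(i - j)).sum(),
        "entropy": -(P[nz] * np.log(P[nz])).sum(),
        "second_moment": (P ** 2).sum(),
        "correlation": (P * (i - mu_i) * (j - mu_j)).sum() / np.sqrt(var_i * var_j),
    }

# Toy 4-level image; horizontal offset (dx=1, dy=0)
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
feats = glcm_features(img, levels=4)
```

For multispectral UAV imagery these measures would be computed per band (the r-, g-, b-, n-, and re- prefixes in Table 4) and typically averaged over several offset angles.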
Table 3. Descriptive statistics of Leaf Area Index (LAI) in the training and testing datasets.
Dataset | Growth Stage | Samples | Min | Max | Mean | SD | CV (%)
Training | Green-up and Jointing | 115 | 0.56 | 5.52 | 3.38 | 1.12 | 34.07
Test | Green-up and Jointing | 29 | 0.73 | 4.96 | 3.17 | 1.27 | 40.13
Table 4. Variable selection results.
Variable | 20 m: Lasso (VI, TI, VI + TI), RFECV (VI, TI, VI + TI) | 40 m: Lasso (VI, TI, VI + TI), RFECV (VI, TI, VI + TI)
NIR
ExR
MGRVI
RGBVI
NDVI
GNDVI
NDREI
EVI
OSAVI
MCARI
TCARI
MSR
SAVI
NLI
RDVI
MTCI
MTVI1
GLA
VEG
r-mean
r-homogeneity
r-contrast
r-dissimilarity
r-entropy
r-second moment
r-correlation
g-mean
g-variance
g-homogeneity
g-contrast
g-dissimilarity
g-entropy
g-second moment
g-correlation
b-variance
b-homogeneity
b-contrast
b-dissimilarity
b-entropy
b-second moment
b-correlation
n-mean
n-variance
n-homogeneity
n-contrast
n-dissimilarity
n-entropy
n-correlation
re-dissimilarity
re-entropy
re-correlation
Note: The variables listed in the table are those selected through feature selection.
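Table 4's predictors were screened with Lasso and cross-validated recursive feature elimination (RFECV). As a rough illustration of the elimination idea only (not the authors' implementation, which used cross-validated selectors over the full index set), the sketch below repeatedly refits an ordinary least-squares model and drops the feature with the smallest standardized coefficient; all names and the synthetic data are hypothetical.

```python
import numpy as np

def rfe_linear(X, y, n_keep):
    """Simplified recursive feature elimination: fit least squares on the
    surviving columns, drop the feature with the smallest |standardized
    coefficient|, and repeat until n_keep features remain."""
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        Xs = X[:, keep]
        # Standardize so coefficient magnitudes are comparable across features
        Xn = (Xs - Xs.mean(axis=0)) / Xs.std(axis=0)
        A = np.column_stack([Xn, np.ones(len(y))])  # add intercept column
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        worst = int(np.argmin(np.abs(coef[:-1])))   # ignore the intercept
        keep.pop(worst)
    return keep

# Synthetic demo: only features 0 and 1 actually drive the response
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.normal(size=200)
selected = rfe_linear(X, y, n_keep=2)
```

In the study itself, the number of retained features was chosen by the cross-validated minimum RMSE (Figure 4) rather than fixed in advance.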
Table 5. Model results based on vegetation index-selected variables.
Height | Selection | Model | Train R2 | Train RMSE | Train RRMSE | Train RPD | Test R2 | Test RMSE | Test RRMSE | Test RPD
20 m | Lasso | MLR | 0.8664 | 0.4097 | 0.1246 | 2.7360 | 0.7129 | 0.6823 | 0.2150 | 1.8664
20 m | Lasso | RF | 0.9171 | 0.3227 | 0.0981 | 3.4742 | 0.8912 | 0.4201 | 0.1324 | 3.0313
20 m | Lasso | SVM | 0.8474 | 0.4379 | 0.1331 | 2.5599 | 0.7939 | 0.5781 | 0.1822 | 2.2028
20 m | Lasso | BPNN | 0.8925 | 0.3774 | 0.1136 | 3.0500 | 0.6132 | 0.7655 | 0.2452 | 1.6078
20 m | RFECV | MLR | 0.8863 | 0.3780 | 0.1149 | 2.9657 | 0.6658 | 0.7361 | 0.2320 | 1.7299
20 m | RFECV | RF | 0.9423 | 0.2692 | 0.0818 | 4.1638 | 0.8896 | 0.4231 | 0.1333 | 3.0095
20 m | RFECV | SVM | 0.8711 | 0.4025 | 0.1223 | 2.7854 | 0.7845 | 0.5912 | 0.1863 | 2.1539
20 m | RFECV | BPNN | 0.8646 | 0.4235 | 0.1275 | 2.7178 | 0.7568 | 0.6069 | 0.1944 | 2.0279
40 m | Lasso | MLR | 0.8538 | 0.4286 | 0.1303 | 2.6153 | 0.8877 | 0.4268 | 0.1345 | 2.9837
40 m | Lasso | RF | 0.9269 | 0.3031 | 0.0921 | 3.6982 | 0.8814 | 0.4385 | 0.1382 | 2.9041
40 m | Lasso | SVM | 0.8493 | 0.4351 | 0.1323 | 2.5763 | 0.8812 | 0.4389 | 0.1383 | 2.9016
40 m | Lasso | BPNN | 0.8688 | 0.4170 | 0.1255 | 2.7604 | 0.8688 | 0.4458 | 0.1428 | 2.7607
40 m | RFECV | MLR | 0.8512 | 0.4324 | 0.1315 | 2.5923 | 0.8943 | 0.4139 | 0.1304 | 3.0763
40 m | RFECV | RF | 0.9002 | 0.3542 | 0.1077 | 3.1653 | 0.8710 | 0.4574 | 0.1441 | 2.7842
40 m | RFECV | SVM | 0.8500 | 0.4342 | 0.1320 | 2.5819 | 0.8943 | 0.4140 | 0.1305 | 3.0759
40 m | RFECV | BPNN | 0.8734 | 0.4096 | 0.1233 | 2.8104 | 0.8731 | 0.4384 | 0.1404 | 2.8076
Note: the best results are in bold.
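The four accuracy statistics reported in Tables 5–7 follow standard definitions. A minimal sketch (the function name is illustrative), assuming RRMSE is the RMSE normalized by the mean of the observations and RPD is the ratio of the observations' standard deviation to the RMSE:

```python
import numpy as np

def eval_metrics(obs, pred):
    """Compute R2, RMSE, relative RMSE (RRMSE = RMSE / mean(obs)), and
    the ratio of performance to deviation (RPD = SD(obs) / RMSE)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    resid = obs - pred
    rmse = np.sqrt(np.mean(resid ** 2))
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return {
        "R2": 1.0 - np.sum(resid ** 2) / ss_tot,  # coefficient of determination
        "RMSE": rmse,
        "RRMSE": rmse / obs.mean(),
        "RPD": obs.std(ddof=1) / rmse,            # sample SD over RMSE
    }
```

An RPD above 2.0 is conventionally read as a model adequate for quantitative prediction, which is why the RPD values near or above 3.0 in the best rows indicate strong LAI estimation performance.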
Table 6. Model results based on texture index-selected variables.
Height | Selection | Model | Train R2 | Train RMSE | Train RRMSE | Train RPD | Test R2 | Test RMSE | Test RRMSE | Test RPD
20 m | Lasso | MLR | 0.8601 | 0.4193 | 0.1275 | 2.6736 | 0.8576 | 0.4805 | 0.1514 | 2.6501
20 m | Lasso | RF | 0.9131 | 0.3304 | 0.1005 | 3.3923 | 0.8792 | 0.4426 | 0.1395 | 2.8770
20 m | Lasso | SVM | 0.6174 | 0.6934 | 0.2108 | 1.6167 | 0.5426 | 0.8612 | 0.2714 | 1.4786
20 m | Lasso | BPNN | 0.8810 | 0.3971 | 0.1195 | 2.8987 | 0.8313 | 0.5054 | 0.1619 | 2.4350
20 m | RFECV | MLR | 0.8908 | 0.3704 | 0.1126 | 3.0268 | 0.8757 | 0.4490 | 0.1415 | 2.8361
20 m | RFECV | RF | 0.9284 | 0.3001 | 0.0912 | 3.7359 | 0.8785 | 0.4439 | 0.1399 | 2.8683
20 m | RFECV | SVM | 0.8607 | 0.4183 | 0.1272 | 2.6797 | 0.8349 | 0.5174 | 0.1630 | 2.4613
20 m | RFECV | BPNN | 0.8477 | 0.4492 | 0.1352 | 2.5627 | 0.8035 | 0.5456 | 0.1748 | 2.2559
40 m | Lasso | MLR | 0.8871 | 0.3900 | 0.1223 | 2.9759 | 0.8293 | 0.4445 | 0.1244 | 2.4201
40 m | Lasso | RF | 0.9381 | 0.2886 | 0.0905 | 4.0205 | 0.8471 | 0.4207 | 0.1177 | 2.5572
40 m | Lasso | SVM | 0.6336 | 0.7025 | 0.2203 | 1.6520 | 0.3419 | 0.8727 | 0.2442 | 1.2327
40 m | Lasso | BPNN | 0.8726 | 0.4109 | 0.1237 | 2.8014 | 0.8652 | 0.4518 | 0.1447 | 2.7239
40 m | RFECV | MLR | 0.9018 | 0.3512 | 0.1068 | 3.1919 | 0.8725 | 0.4547 | 0.1433 | 2.8002
40 m | RFECV | RF | 0.9061 | 0.3435 | 0.1044 | 3.2633 | 0.8894 | 0.4236 | 0.1335 | 3.0063
40 m | RFECV | SVM | 0.8422 | 0.4453 | 0.1354 | 2.5174 | 0.8014 | 0.5675 | 0.1789 | 2.2437
40 m | RFECV | BPNN | 0.8516 | 0.4434 | 0.1335 | 2.5961 | 0.8429 | 0.4878 | 0.1562 | 2.5233
Note: the best results are in bold.
Table 7. Model results based on vegetation and texture index-selected variables.
Height | Selection | Model | Train R2 | Train RMSE | Train RRMSE | Train RPD | Test R2 | Test RMSE | Test RRMSE | Test RPD
20 m | Lasso | MLR | 0.8774 | 0.3925 | 0.1193 | 2.8563 | 0.8086 | 0.5571 | 0.1756 | 2.2857
20 m | Lasso | RF | 0.9343 | 0.2874 | 0.0874 | 3.9008 | 0.9071 | 0.3882 | 0.1223 | 3.2805
20 m | Lasso | SVM | 0.7414 | 0.5701 | 0.1733 | 1.9663 | 0.7994 | 0.5703 | 0.1797 | 2.2329
20 m | Lasso | BPNN | 0.8922 | 0.3779 | 0.1137 | 3.0463 | 0.7876 | 0.5672 | 0.1817 | 2.1700
20 m | RFECV | MLR | 0.8835 | 0.3827 | 0.1163 | 2.9294 | 0.6891 | 0.7100 | 0.2237 | 1.7935
20 m | RFECV | RF | 0.9437 | 0.2660 | 0.0809 | 4.2145 | 0.9073 | 0.3877 | 0.1222 | 3.2842
20 m | RFECV | SVM | 0.8734 | 0.3989 | 0.1213 | 2.8099 | 0.7245 | 0.6683 | 0.2106 | 1.9053
20 m | RFECV | BPNN | 0.8802 | 0.3984 | 0.1199 | 2.8893 | 0.7225 | 0.6484 | 0.2077 | 1.8982
40 m | Lasso | MLR | 0.8776 | 0.3922 | 0.1192 | 2.8585 | 0.9059 | 0.3906 | 0.1231 | 3.2600
40 m | Lasso | RF | 0.9388 | 0.2773 | 0.0843 | 4.0426 | 0.8806 | 0.4400 | 0.1387 | 2.8943
40 m | Lasso | SVM | 0.8642 | 0.4131 | 0.1256 | 2.7134 | 0.8780 | 0.4447 | 0.1401 | 2.8634
40 m | Lasso | BPNN | 0.8697 | 0.4156 | 0.1251 | 2.7698 | 0.8707 | 0.4425 | 0.1417 | 2.7814
40 m | RFECV | MLR | 0.8527 | 0.4302 | 0.1308 | 2.6056 | 0.8998 | 0.4032 | 0.1271 | 3.1584
40 m | RFECV | RF | 0.9366 | 0.2822 | 0.0858 | 3.9724 | 0.9210 | 0.3579 | 0.1128 | 3.5575
40 m | RFECV | SVM | 0.8493 | 0.4352 | 0.1323 | 2.5756 | 0.8812 | 0.4389 | 0.1383 | 2.9013
40 m | RFECV | BPNN | 0.8726 | 0.4108 | 0.1236 | 2.8022 | 0.8701 | 0.4436 | 0.1421 | 2.7746
Note: the best results are in bold.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Li, W.; Wang, J.; Zhang, Y.; Yin, Q.; Wang, W.; Zhou, G.; Huo, Z. Combining Texture, Color, and Vegetation Index from Unmanned Aerial Vehicle Multispectral Images to Estimate Winter Wheat Leaf Area Index during the Vegetative Growth Stage. Remote Sens. 2023, 15, 5715. https://doi.org/10.3390/rs15245715

