Article

Research on Estimating Potato Fraction Vegetation Coverage (FVC) Based on the Vegetation Index Intersection Method

1 College of Optical, Mechanical and Electrical Engineering, Zhejiang A&F University, Hangzhou 311300, China
2 College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China
3 Kitami Institute of Technology, 165 Koen-cho, Kitami-shi 090-8507, Hokkaido, Japan
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Agronomy 2024, 14(8), 1620; https://doi.org/10.3390/agronomy14081620
Submission received: 30 June 2024 / Revised: 18 July 2024 / Accepted: 22 July 2024 / Published: 24 July 2024
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

The acquisition of vegetation coverage information is crucial for crop field management, and utilizing visible light spectrum vegetation indices to extract vegetation coverage information is a commonly used method. However, most visible light spectrum vegetation indices do not fully consider the relationships between the red, green, and blue bands during their construction, making it difficult to ensure the accurate extraction of coverage information throughout the crop’s entire growth cycle. To rapidly and accurately obtain potato vegetation coverage information, drones were used in this study to obtain high-resolution digital orthoimages of potato growth stages. Based on the differences in the grayscale values of potato plants, soil, shadows, and drip irrigation belts, this study presents a combination index of blue and green bands (BGCI) and a combination index of red and green bands (RGCI). The vegetation index intersection method was used with 10 vegetation information indices to extract vegetation coverage, and the differences in extraction accuracy were compared with those of the maximum entropy method and bimodal histogram method. Based on the high-precision fraction vegetation coverage (FVC) extraction results, the Pearson correlation coefficient method and random forest feature selection were used to screen 10 vegetation and 24 texture features, and the top six vegetation indices most strongly correlated with the FVC were selected for potato growth stage FVC estimation and accuracy verification. A high-precision potato vegetation coverage estimation model was successfully established. This study revealed that during the potato tuber formation and expansion stages, the BGCI combined with the vegetation index intersection method achieved the highest vegetation coverage extraction accuracy, with overall accuracies of 99.61% and 98.84%, respectively. 
The RGCI combined with the vegetation index intersection method achieved the highest accuracy, 98.63%, during the maturation stage. For the potato vegetation coverage estimation models, the model based on the BGCI achieved the highest estimation accuracy (R2 = 0.9116, RMSE = 5.7903), and the RGCI also achieved good accuracy in terms of vegetation coverage estimation (R2 = 0.8987, RMSE = 5.8633). In the generality verification of the models, the R2 values of the FVC estimation models based on the BGCI and RGCI were both greater than 0.94. A potato vegetation coverage estimation model was constructed based on two new vegetation information indices, demonstrating good accuracy and universality.

1. Introduction

Fraction vegetation coverage (FVC) is the ratio of the vertical projection of the canopy onto the ground to the total ground surface area. FVC is expressed as a fraction or percentage of the reference area and is influenced by field management practices such as nitrogen application and irrigation. FVC is closely related to crop yield [1] and to physiological activities such as photosynthesis and transpiration, and it can reflect the growth status of crops [2]. For tuber crops such as potatoes, monitoring FVC provides timely insight into the growth of underground tubers and the nutritional status of the crop [3], as well as data support for irrigation and fertilizer application. Monitoring crop vegetation cover is therefore highly important for crop management [4]. To date, high-resolution remote sensing images have been widely used in many fields to obtain spatial information, providing a more accurate information source for vegetation monitoring [5]. Remote sensing images facilitate the extraction and monitoring of vegetation information at the field scale.
Common remote sensing technologies used for monitoring crop growth status include proximal remote sensing, aerial remote sensing, and satellite remote sensing. Proximal remote sensing is not suitable for field-scale crop growth monitoring due to limitations in measurement speed [6]. Satellite remote sensing can obtain large-scale crop field information and is therefore often used in agriculture. Ma et al. used Sentinel-2 remote sensing data to invert the leaf area index of wheat and adopted the SP-UCI optimization algorithm to establish a wheat yield estimation model [7]. Worrall et al. combined satellite remote sensing data with a domain-oriented neural network to estimate the growth stages of maize [8]. Shamsuzzoha et al. utilized Landsat-8 image datasets to determine rice RGVI changes caused by cyclones and established machine learning models for predicting major land use phenological changes in rice crops [9]. Although satellite remote sensing can process large-area crop data, its practical application to crop phenotypic information such as vegetation coverage is limited by long measurement cycles, the low resolution of the captured data, and complex data processing.
Drone remote sensing is widely applied in crop growth monitoring due to its high image resolution and operational flexibility. Drone remote sensing technology can be categorized into hyperspectral remote sensing, multispectral remote sensing, and visible light remote sensing. Hyperspectral and multispectral remote sensing can acquire more detailed crop information than visible light remote sensing [10]. However, these technologies come with higher costs and more complex data processing requirements [11]. Although drone visible light remote sensing images contain less information, which limits their application in crop growth monitoring, visible light cameras are more cost-effective than hyperspectral and multispectral cameras, and most mainstream drones are equipped with visible light cameras [12]. Therefore, exploring methods to monitor crop growth using drone visible light remote sensing data is of significant importance for reducing agricultural production costs and promoting the use of drones in the agricultural sector.
Estimating crop FVC based on UAV remote sensing technology commonly relies on machine learning and threshold-based methods [13]. Although machine learning methods provide higher accuracy in estimating crop FVC, they often require a large number of training samples to achieve accurate large-scale coverage estimation, which makes sample selection inefficient and highly susceptible to human bias [14]. Threshold-based methods are simple, efficient, and accurate; however, the quality of the extracted threshold strongly affects the estimation accuracy of crop FVC, making precise threshold extraction crucial [15]. The Otsu method is a commonly used technique for determining thresholds and has been widely applied in crop classification, but it can result in under-segmentation in certain situations [16]. Additionally, the resolution of UAV multispectral remote sensing images is relatively low, making the Otsu method unsuitable for them. The bimodal histogram method and the maximum entropy threshold method are also commonly used for estimating crop FVC [17]. For remote sensing images containing only crops and a soil background, the grayscale histogram typically shows a clear bimodal distribution, but for complex images with more than two peaks, the bimodal histogram method is not suitable. Noise in the images can affect the maximum entropy value, leading to lower accuracy in FVC estimation [18]. Therefore, selecting a threshold extraction method that is stable, accurate, and easy to operate is essential for estimating crop FVC.
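The Otsu method discussed above maximizes the between-class variance of a grayscale histogram. The following stdlib-only Python is an illustrative sketch of that idea (the pixel data below are made up, not from the study):

```python
# Minimal Otsu threshold sketch on a 1-D list of grayscale values.
def otsu_threshold(values, levels=256):
    """Return the level that maximizes between-class variance."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]              # background pixel count (levels <= t)
        if w_bg == 0:
            continue
        w_fg = total - w_bg          # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated clusters: the threshold falls between them.
pixels = [10] * 50 + [12] * 30 + [200] * 40 + [210] * 20
print(otsu_threshold(pixels))
```

On a clean bimodal histogram like this the method works well; the under-segmentation noted in the text arises when the histogram is irregular or the classes overlap heavily.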
The Gaussian mixture model (GMM) threshold method can effectively address the issues present in the aforementioned threshold methods [19]. This method assumes that the target object and background follow a GMM distribution in a color feature and uses the intersection of the GMM as the classification threshold [20]. The GMM threshold method can achieve ideal crop FVC estimation accuracy to some extent. However, using color features for threshold determination often results in low FVC estimation accuracy when crop plants are small or during the near-closure period. Additionally, the GMM threshold method requires appropriate samples to determine the classification threshold, and relying solely on manually selected samples reduces the efficiency of the algorithm [21]. Combining machine learning with the GMM threshold method can achieve rapid FVC estimation and better accuracy [22]. The relationship between the number of samples selected by machine learning and the balance between FVC estimation accuracy and efficiency has not yet been explored, and whether the texture features of crop plants and the background conform to a GMM distribution remains unknown. Therefore, further research on the GMM threshold method is needed to propose a stable, efficient, and high-accuracy crop FVC estimation method.
The main research objectives are as follows: (1) to construct a new vegetation index capable of achieving high-precision vegetation cover extraction for potatoes throughout their entire growth period by combining machine learning and threshold extraction methods to determine its classification threshold; (2) to select vegetation indices and texture features that have a high correlation with FVC and achieve high-precision extraction of potato vegetation cover based on the classification threshold; (3) to establish a vegetation cover estimation model for the entire growth period of potatoes.

2. Materials and Methods

2.1. Experimental Design

To increase sample diversity and improve the reliability of the FVC estimation model, two experimental areas were designed for field experiments. Experiment Area A is located in the Yangling Demonstration Zone, Shaanxi Province (latitude 34°18′10″ N, longitude 108°5′14.33″ E). The soil pH, organic matter, and nitrogen concentration in this experimental area were 7.2, 27.31 g/kg, and 1.96 g/kg, respectively. Potatoes of the Jinxu 16 variety were planted in May 2021, with planting spacing, row spacing, and planting depths of 0.6 m, 0.5 m, and 10 cm, respectively, and the potatoes were harvested in September 2021. Five nitrogen levels were set (N1: 0 kg/ha; N2: 75 kg/ha; N3: 150 kg/ha; N4: 225 kg/ha; and N5: 300 kg/ha). Each treatment was repeated five times, totaling 25 plots, each measuring 7.5 m in length and width. Five ground control points were systematically set around the experimental area for the georeferencing of multiperiod images and the preprocessing of stitched images.
Experiment Area B is located in Ningtiaoliang town, Jingbian County, Yulin city, Shaanxi Province (latitude 37°33′55.60″ N, longitude 108°22′7.46″ E). The soil pH, organic matter, and nitrogen concentration in this experimental area were 7.7, 16.35 g/kg, and 0.89 g/kg, respectively. Three early maturing potato varieties (Hisen 6, 226, and V7) were used in this experiment, with five nitrogen levels (N1: 0 kg/ha; N2: 60 kg/ha; N3: 120 kg/ha; N4: 180 kg/ha; and N5: 240 kg/ha). Each treatment was replicated three times, totaling 15 plots, each measuring 112 m2 (14 m × 8 m). The remaining treatments were the same as those in Experiment Area A. Potatoes were planted in May 2022 and harvested in August 2022, with a total growth cycle of approximately 90 days. The information about Experiment Area B is shown in Figure 1.

2.2. Visible Light Image Acquisition and Preprocessing

Considering the impact of sensors on estimating potato phenotypic information and nitrogen nutrition diagnosis, and to further explore the applicability of the model, Experiment Area A utilized a DJI Phantom 4 RTK drone (SZ DJI Technology Co., Ltd., Shenzhen, China) to capture visible light images during the tuber formation and tuber enlargement stages of potatoes. The drone has an aperture range of f/2.8 to f/11, an ISO range of 100–3200 (automatic) and 100–12,800 (manual), a maximum photo resolution of 5472 × 3648, a flight time of 30 min, and a positioning accuracy (Real-Time Kinematic mode) of 1.5 cm + 1 ppm vertically and 1 cm + 1 ppm horizontally. DJI GS RTK software was used for flight route planning, with the flight altitude set at 30 m and the flight speed at 3 m/s. The drone captured images vertically downwards with 85% forward and side overlap. All flights were conducted at noon under clear, windless conditions. During the flights, five ground control points were visible in the images, and their coordinates were measured using a differential global positioning system (DGPS) to further correct the image positioning information. The ground resolution of the images is 0.95 cm/pixel.
Experiment Area B utilized the DJI Phantom 4 Pro drone to capture visible light images of potato canopies during the tuber formation, tuber enlargement, and tuber maturity stages. The drone has a shutter speed range of 1/2000–1/8000 s, with other parameters identical to those of the DJI Phantom 4 RTK drone. Altizure software was used for drone flight path planning, with flight parameters identical to those used in Experiment Area A.
Preprocessing of remote sensing images is an indispensable process for monitoring crop growth and nutrient diagnosis [23]. Common preprocessing methods for UAV remote sensing technology include image stitching, image registration, and cropping [24]. This study utilized Pix4D mapper software developed by the Swiss company Pix4D AG to stitch all visible light images. The main workflow was as follows: (i) importing images; (ii) importing control point coordinates for point cloud generation; (iii) applying a one-click automatic process for point cloud extraction and 3D model generation; and (iv) generating point cloud data, the DSM, and the DOM [25]. ENVI software Version 5.3 (Exelis Visual Information Solutions, Boulder, CO, USA) was used for image registration of visible light, multispectral orthoimages, and elevation images. The corresponding experimental field areas for the entire growth period of potatoes were cropped using ENVI software.

2.3. Selection and Construction of Remote Sensing Feature Elements

Experiment B included more potato data covering various growth stages and a richer variety of potato species. The potential impact of variety on the accuracy of vegetation cover extraction was considered. This study focused on researching vegetation cover estimation methods using Experiment B and validated them using Experiment A. To reduce the influence of noise on data processing, convolutional low-pass filtering was applied to denoise the visible light images of potatoes captured by drones throughout their growth period. The convolution kernel size was set to 5, with a weighted return value of 0. After denoising, vegetation index calculations and texture feature extraction were performed.

2.3.1. Selection and Construction of Visible Light Band Vegetation Indices

Previous studies have constructed numerous vegetation indices based on visible light images to assess crop vegetation cover. This study selected eight vegetation indices, namely, the GRVI, EXG, RGBVI, MGRVI, NGRVI, NGBDI, GLI, and TRVI [26,27,28,29]. However, the construction process of these common vegetation indices in the visible light band did not adequately consider the interrelationships among the three bands. Changes in individual bands often had a significant impact on the constructed vegetation indices. Therefore, the construction of new vegetation indices using multiband combinations can significantly improve the effectiveness of cover detection.
The visible light remote sensing images of the tuber enlargement stage of potatoes in Experiment B were visually interpreted. The experimental field encompassed various land cover types, contributing to a complex field environment. To effectively distinguish between these land cover types, 100 regions of interest (ROIs) were selected for each type using ENVI software. Subsequently, the characteristic values of the blue, green, and red bands for each land cover type were calculated and summarized, as shown in Table 1.
As shown in Table 1, the grayscale values of potato plants overlap with those of shadows in the blue band and with those of the drip irrigation tape in the green band, and there is also partial overlap in the red band with both shadows and drip irrigation tape. It is therefore difficult to distinguish potato plants using a single band. Scatterplots of the red, green, and blue bands for potato plants, soil, shadows, and drip irrigation tape (Figure 2) were generated. Figure 2 clearly shows distinct boundaries between potato plants and the other land cover categories under the red-green and blue-green combinations: potato plants are mainly concentrated in the lower-left region of the scatterplots, while the other categories are predominantly located in the upper-right region. However, the blue-red combination was unable to separate potato plants from the shadows.
A total of 25 points were selected from the scatterplot boundary lines between plants and shadows and between plants and soil in Figure 2a,b for linear fitting. The fitting results are shown in Figure 3a,b.
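The boundary-line fitting step can be reproduced with ordinary least squares. The points below are hypothetical stand-ins for the 25 boundary points used in the study, placed on a line close to the fitted coefficients reported in the next section:

```python
# Least-squares fit of a boundary line G = a*B + b in blue-green space.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical (blue, green) boundary samples lying on G = 1.03*B + 19.3.
blue  = [40, 60, 80, 100, 120]
green = [60.5, 81.1, 101.7, 122.3, 142.9]
a, b = fit_line(blue, green)
print(round(a, 3), round(b, 2))
```

The recovered slope and intercept then serve as the band coefficients of the combination index, as in Equations (1) and (2).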
The fitted boundary functions were then used to construct new combination indices: the combination index of blue and green bands (BGCI) and the combination index of red and green bands (RGCI). The corresponding formulas are shown as Equation (1) and Equation (2), respectively, as follows:
BGCI = 2G − 1.0299B − 19.308 − R
RGCI = 2G − 1.1555R − 25.007 − B
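Reading Equations (1) and (2) as BGCI = 2G − 1.0299B − 19.308 − R and RGCI = 2G − 1.1555R − 25.007 − B (an EXG-like form with the fitted boundary coefficients, which is our interpretation of the flattened formulas), a per-pixel computation looks like this; the band values are made-up examples:

```python
# Proposed combination indices, per pixel (R, G, B are 0-255 band values).
def bgci(r, g, b):
    # 2G minus the fitted blue-green boundary line, minus red.
    return 2 * g - 1.0299 * b - 19.308 - r

def rgci(r, g, b):
    # 2G minus the fitted red-green boundary line, minus blue.
    return 2 * g - 1.1555 * r - 25.007 - b

# A bright-green canopy pixel scores higher than a grey soil pixel.
canopy = (60, 140, 50)   # (R, G, B)
soil   = (120, 110, 100)
print(bgci(*canopy) > bgci(*soil))
print(rgci(*canopy) > rgci(*soil))
```

Pixels above the index threshold derived later (Section 2.4) are then counted as vegetation.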

2.3.2. Texture Feature Extraction

Texture is a visual characteristic that reflects the homogeneous appearance of crops, embodying the structural arrangement attributes of periodic changes on the surface of objects in the image [30]. Currently, texture features have been widely applied in crop classification and prediction [31]. The gray level co-occurrence matrix (GLCM) is a common method for calculating image texture features, mainly by extracting texture features through the calculation of conditional density probability functions of the grayscale levels of objects in the image [18]. In this study, the ENVI software was used to calculate eight common texture features, namely, the mean, variance, homogeneity, contrast, dissimilarity, entropy, angular second moment, and correlation, within a 7 × 7 window.
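The study extracts the eight GLCM features with ENVI over 7 × 7 windows; to show the mechanics, here is a stdlib-only toy computation of a symmetric horizontal-offset co-occurrence matrix and two of the listed features (contrast and entropy) on a tiny 4-level image:

```python
import math

def glcm_features(img, levels):
    """Contrast and entropy from a symmetric GLCM with offset (0, 1)."""
    counts = [[0] * levels for _ in range(levels)]
    for row in img:
        for a, b in zip(row, row[1:]):   # horizontally adjacent pairs
            counts[a][b] += 1
            counts[b][a] += 1            # symmetric matrix
    total = sum(map(sum, counts))
    p = [[c / total for c in row] for row in counts]
    contrast = sum(p[i][j] * (i - j) ** 2
                   for i in range(levels) for j in range(levels))
    entropy = -sum(v * math.log(v) for row in p for v in row if v > 0)
    return contrast, entropy

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3]]
contrast, entropy = glcm_features(img, levels=4)
print(round(contrast, 3), round(entropy, 3))
```

The remaining features (mean, variance, homogeneity, dissimilarity, angular second moment, correlation) are analogous sums over the same normalized matrix.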

2.4. Fraction Vegetation Coverage Extraction Method

2.4.1. Determination of Extraction Thresholds

The vegetation index intersection method can effectively achieve vegetation coverage extraction. However, the common applications and vegetation indices of the vegetation index intersection method are currently designed for specific environments. Further investigations are needed to determine the vegetation coverage of potatoes throughout their entire growth period [32]. In this study, the threshold determination process based on the vegetation index intersection method was divided into two main parts; one part involved clipping the entire experimental potato field and performing supervised classification on the clipped area. The other part combined the supervised classification results with the Gaussian mixture model to further determine the extraction threshold for potato vegetation coverage. This study presents solved thresholds for the constructed RGCI, BGCI, and eight other common vegetation indices solved using the vegetation index intersection method.
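The core of the intersection step is solving for the point where the two fitted Gaussian densities (plant vs. background index values) are equal, which reduces to a quadratic. The sketch below assumes each class has already been fitted with a mean and standard deviation; the parameters are illustrative, not from the study:

```python
import math

def gaussian_intersection(m1, s1, m2, s2):
    """x where the N(m1, s1) and N(m2, s2) densities are equal."""
    # Setting the two log-densities equal gives a*x^2 + b*x + c = 0.
    a = 1 / (2 * s1 ** 2) - 1 / (2 * s2 ** 2)
    b = m2 / (s2 ** 2) - m1 / (s1 ** 2)
    c = (m1 ** 2 / (2 * s1 ** 2) - m2 ** 2 / (2 * s2 ** 2)
         - math.log(s2 / s1))
    if abs(a) < 1e-12:               # equal variances: single midpoint
        return [-c / b]
    disc = math.sqrt(b ** 2 - 4 * a * c)
    return sorted([(-b + disc) / (2 * a), (-b - disc) / (2 * a)])

# Background index values around 20, plant values around 90: take the
# intersection lying between the two means as the threshold.
roots = gaussian_intersection(20, 8, 90, 12)
threshold = [x for x in roots if 20 < x < 90][0]
print(round(threshold, 1))
```

With unequal variances there are two mathematical intersections; only the one between the class means is meaningful as a classification threshold.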
The support vector machine (SVM) is widely used in remote sensing image classification and effectively addresses issues such as small samples, nonlinearity, and high dimensionality; it also has strong generalization capabilities. The radial basis function was used as the kernel. To efficiently process the sample data, the SVM was used for supervised classification of the cropped images, with the cropping range set to 7 m × 7 m. The specific process for supervised classification using the SVM was as follows:
(1)
The orthophoto images of the potato field were cropped and 40 regions of interest (ROIs) of potato plants and the background on the orthophoto images were selected. The separability of the selected samples was calculated.
(2)
Based on the separability of the samples from the three growth stages of potatoes, the reasonableness of the selected samples was assessed.
(3)
Additionally, 30 ROIs of potato plants and the background were selected to verify the SVM classification results using a confusion matrix.

2.4.2. Validation of Vegetation Coverage Extraction Accuracy

Using the determined thresholds, the vegetation coverage of potatoes was extracted for each growth stage. In this study, 300 regions of interest (ROIs) of potato plants and the background were selected outside the cropped areas of the potato experimental field using visual interpretation. A confusion matrix was used to validate the vegetation coverage extracted by the vegetation index intersection method for the three potato growth stages. The validation was evaluated using the kappa coefficient and overall accuracy.
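The two validation metrics used here, overall accuracy and the kappa coefficient, follow directly from the confusion matrix. A minimal sketch with hypothetical counts:

```python
# Overall accuracy and Cohen's kappa from a confusion matrix
# (rows: reference class, columns: predicted class).
def accuracy_and_kappa(cm):
    total = sum(sum(row) for row in cm)
    observed = sum(cm[i][i] for i in range(len(cm))) / total
    # Chance agreement from the row and column marginals.
    expected = sum(
        sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))
    ) / total ** 2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

cm = [[295, 5],    # reference plant: 295 classified plant, 5 background
      [3, 297]]    # reference background
oa, k = accuracy_and_kappa(cm)
print(round(oa, 4), round(k, 4))
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy throughout the Results section.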

2.5. Establishment of the Fraction Vegetation Coverage Estimation Model

2.5.1. Vegetation Index and Texture Feature Selection

Various potential covariates can be used for vegetation coverage estimation, but applying all available covariates can reduce the data processing efficiency. Therefore, it is necessary to filter out important features before establishing the estimation model.
Based on the best potato vegetation coverage results from Experiment B, three sampling areas were selected from each of the fifteen experimental plots to measure vegetation coverage during the three growth stages. The visible light image consists of three bands, red, green, and blue, each containing eight texture features. To facilitate presentation, the 24 texture features are shown in Table 2. The mean values of the corresponding 10 vegetation indices and 24 texture features were calculated for each sampling area.
To improve the computational efficiency of the model, this study utilized the random forest and Pearson correlation coefficient methods to screen the 10 vegetation indices and 24 texture features and obtain the important remote sensing factors. Random forest (RF) is an algorithm that measures feature importance by randomly permuting each feature [33]. When a feature is highly important, permuting it increases the prediction error rate of the RF model. The change in the error rate on the out-of-bag data before and after feature permutation is used to evaluate each feature and obtain its importance score. This study used the Python 3.6 Scikit-learn package to implement this algorithm, with repeated tenfold cross-validation to optimize the accuracy of the RF model. Feature subsets with minimal impact on the random forest model were used as the predictor subsets.
The Pearson correlation coefficient was used to detect the linear correlation between continuous variables, with values ranging from −1 to 1; positive and negative values indicate positive and negative correlations, respectively. The larger the absolute value is, the greater the linear correlation.
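The Pearson coefficient used for this screening is the covariance of the two series normalized by their standard deviations. A stdlib sketch (the two series below are made-up values, not measurements from the study):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical vegetation-index values vs. measured FVC (%): nearly linear,
# so the coefficient is close to +1.
index_vals = [0.10, 0.25, 0.40, 0.55, 0.70]
fvc_vals   = [12.0, 27.0, 41.0, 58.0, 69.0]
print(round(pearson(index_vals, fvc_vals), 3))
```

Features are then ranked by the absolute value of this coefficient, since both strong positive and strong negative correlations are informative.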

2.5.2. Establishment and Validation of the Fraction Vegetation Coverage Estimation Model

This study used RF and Pearson correlation coefficients to assess the importance of vegetation indices and texture features. Based on the top six features selected by these algorithms, a model for estimating the FVC was constructed. For each potato growth stage, a total of 45 vegetation coverage extraction results were obtained. Thirty randomly selected data points were used to construct the vegetation coverage estimation model, while the remaining fifteen data points were used for model validation. Linear fitting was used to establish a vegetation coverage estimation model for potato plants and validate its performance.
Common model evaluation metrics include the coefficient of determination (R2) and root mean square error (RMSE) [34]. A higher R2 value and a smaller RMSE value indicate higher model accuracy. In this study, the evaluation metrics R2 and RMSE were used to validate the performance of the models. The formulas for calculating R2 and RMSE are shown in Equations (3) and (4), respectively, as follows:
R^2 = 1 − Σ_{i=1}^{n} (x_i − y_i)^2 / Σ_{i=1}^{n} (x_i − x̄)^2
RMSE = sqrt( Σ_{i=1}^{n} (x_i − y_i)^2 / n )
In the equations, x_i represents the measured values; y_i represents the predicted values; x̄ represents the mean of the measured values; and n represents the sample size.
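Equations (3) and (4), i.e., R² = 1 − Σ(xᵢ − yᵢ)² / Σ(xᵢ − x̄)² and RMSE = √(Σ(xᵢ − yᵢ)² / n), translate directly into code; the measured/predicted values below are hypothetical:

```python
import math

def r2_rmse(x, y):
    """R^2 and RMSE of predictions y against measured values x."""
    n = len(x)
    mean_x = sum(x) / n
    ss_res = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((xi - mean_x) ** 2 for xi in x)
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / n)

measured  = [20.0, 35.0, 50.0, 65.0, 80.0]   # FVC, %
predicted = [22.0, 33.0, 52.0, 64.0, 79.0]
r2, rmse = r2_rmse(measured, predicted)
print(round(r2, 4), round(rmse, 4))
```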

3. Results

3.1. Extraction of FVC Thresholds Based on the Vegetation Index Intersection Method

To determine the effectiveness of the SVM in classifying plants and the background in images, the classification performance of the SVM was validated using a confusion matrix, as shown in Figure 4. The overall classification accuracy for the experimental field during the tuber formation stage was 99.922%, with a kappa coefficient of 0.9982. During the tuber enlargement stage, the overall classification accuracy was 99.9871%, with a kappa coefficient of 0.9997. For the maturity stage, the overall classification accuracy was 99.8899%, with a kappa coefficient of 0.9977. The validation results from the confusion matrix indicate that the SVM achieved high classification accuracy in the small-potato regions across all three stages. The overall classification accuracy for the three growth stages of potatoes was greater than 99%, and the kappa coefficient was greater than 0.99. Therefore, this method can be used to determine the FVC extraction threshold for potatoes.
The extraction thresholds for various vegetation indices were calculated based on the vegetation index intersection method. Since the newly constructed vegetation indices, RGCI and BGCI, are similar in principle to the EXG vegetation index, this study focused on comparing the threshold extraction results of these three indices. As shown in Figure 5, Figure 6 and Figure 7, the EXG histograms exhibit irregularities during the three growth stages of the potatoes. In particular, in the tuber formation stage, multiple intersections in the EXG histogram prevent the effective determination of the extraction threshold for the potato vegetation coverage. The newly constructed RGCI and BGCI vegetation indices effectively address the irregularity issues of the EXG index and can better determine the classification thresholds for the background and potato plants. The coverage extraction thresholds of the other selected vegetation indices can also be determined via the vegetation index intersection method. The extraction threshold results for each vegetation index corresponding to the three growth stages of potatoes are shown in Table 3.

3.2. Extraction Results of Potato FVC

The determined thresholds were applied to extract the vegetation coverage of the entire experimental field for the corresponding three growth stages of potatoes. Pixels with values greater than the threshold were classified as potato plant pixels, while those with values less than the threshold were classified as background pixels. The vegetation coverage of the potatoes was then calculated using Equation (5). The extraction results of potato vegetation coverage for the three stages are shown in Figure 8.
M_FVC = N_potato / (N_potato + N_soil)
In the formula, M_FVC, N_potato, and N_soil represent the potato vegetation coverage, the number of potato plant pixels, and the number of soil (background) pixels, respectively.
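Equation (5) amounts to a pixel count over the thresholded index image. A minimal sketch, with a tiny hypothetical index "image" and threshold:

```python
# FVC as the fraction of index pixels above the classification threshold.
def fvc_from_index(index_image, threshold):
    n_plant = sum(1 for row in index_image for v in row if v > threshold)
    n_total = sum(len(row) for row in index_image)
    return n_plant / n_total

# 3x4 grid of hypothetical vegetation-index values; plants score high.
index_image = [
    [55.0, 60.2, 12.1, 8.4],
    [48.7, 51.3, 9.9, 14.2],
    [47.1, 10.5, 11.8, 7.6],
]
print(round(fvc_from_index(index_image, threshold=30.0), 3))
```

In practice the same count runs over the full orthoimage of each plot, with the threshold taken from Table 3 for the corresponding index and growth stage.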
Figure 9 shows the accuracy verification results of potato vegetation coverage extraction based on the vegetation index intersection method. Across the three growth stages, all vegetation indices achieved high accuracy in extracting vegetation coverage, with the exception of the GRVI, whose overall accuracy during the tuber formation stage was below 90% with a kappa coefficient below 0.5.
During the tuber formation stage, the BGCI achieved the highest accuracy in terms of vegetation coverage extraction among all the vegetation indices, with an overall accuracy of 99.6079% and a kappa coefficient of 0.9898. Similarly, during the tuber enlargement stage, the BGCI achieved the highest accuracy in terms of vegetation coverage extraction, with an overall accuracy and kappa coefficient of 98.8405% and 0.9753, respectively. In the potato maturity stage, the RGCI achieved the highest accuracy in estimating vegetation coverage, with an overall accuracy and kappa coefficient of 98.6336% and 0.9712, respectively.
Through comparison, the RGCI and BGCI demonstrated better accuracy in extracting potato vegetation coverage based on the vegetation index intersection method than did the other vegetation indices. Comparing the vegetation coverage extraction results of the first two stages, the overall accuracy of the BGCI was higher than that of the RGCI by 0.1741% and 0.4866%, respectively. However, in the potato maturity stage, the overall classification accuracy of the BGCI was lower than that of the RGCI by 0.7719%.

3.3. Potato Vegetation Coverage Estimation Model

To ensure the effective estimation accuracy of the potato FVC estimation model throughout the entire growth period, this study selected the BGCI combined with the vegetation index intersection method to extract vegetation coverage during the tuber formation and tuber enlargement stages of the potatoes. The RGCI combined with the vegetation index intersection method was chosen to extract vegetation coverage during the maturity stage. The extraction results are shown in Figure 10.
During the feature selection process for individual potato growth stages, the limited amount of data from each stage may lead to nonrepresentative feature selection results. Therefore, this study conducted feature selection for the entire growth period of the potato plants. As shown in Figure 11, the Pearson correlation coefficient method over the entire potato growth period revealed that the BGCI and RGBVI exhibited the highest correlation with FVC (R = 0.92), followed by the EXG and RGCI, with correlation coefficients of 0.91 for both indices. Next in line were the NGRVI (R = 0.90) and NGBDI (R = 0.89).
Among the 24 texture features, B-correlation was the most highly correlated with FVC (R = 0.73), but its correlation coefficient was still below 0.75. Therefore, the results of the Pearson correlation coefficient method indicated that texture features are not suitable for establishing the potato FVC estimation model.
The results of the random forest feature selection also showed that the BGCI obtained the highest score, with an importance of 0.1969, followed by the NGBDI, RGCI, RGBVI, NGRVI, and EXG, with importances of 0.1241, 0.0894, 0.0554, 0.0483, and 0.0332, respectively; the latter four importances were each below 0.1. Although the feature importance ranking from random forest selection was not entirely consistent with that of the Pearson correlation coefficient method, both methods identified the BGCI, NGBDI, RGCI, RGBVI, NGRVI, and EXG as the six remote sensing feature factors with relatively high correlations to potato FVC. These selected vegetation indices were used to estimate potato vegetation coverage.
Using the six selected vegetation indices, vegetation coverage over the entire potato growth period was estimated, and the estimation results were analyzed and evaluated. The accuracy of the FVC estimation models was assessed using R2 and RMSE values; as the volume of vegetation coverage data across the growth period increased, the BGCI and RGCI achieved better FVC estimation accuracy. As shown in Figure 12, the BGCI obtained the highest FVC estimation accuracy over the entire potato growth period (R2 = 0.9116, RMSE = 5.7903), followed by the EXG (R2 = 0.9065, RMSE = 5.8669) and RGCI (R2 = 0.8987, RMSE = 5.8633). The NGRVI had the lowest FVC estimation accuracy (R2 = 0.7175, RMSE = 9.7841).
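The R2 and RMSE metrics used above can be computed as follows. The linear model form (FVC = a·VI + b) is an assumption for illustration; the study does not restate its regression form in this section.

```python
import numpy as np

def fit_and_evaluate(index_values, fvc_measured):
    """Fit a simple linear FVC model (FVC = a*VI + b) and report the
    R2 and RMSE used in the accuracy comparison."""
    a, b = np.polyfit(index_values, fvc_measured, 1)
    fvc_pred = a * index_values + b
    residuals = fvc_measured - fvc_pred
    ss_res = (residuals ** 2).sum()
    ss_tot = ((fvc_measured - fvc_measured.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt((residuals ** 2).mean())
    return r2, rmse
```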

4. Discussion

4.1. Comparison of Different Threshold Extraction Methods

Common methods for estimating vegetation coverage include machine learning and thresholding methods [35]. Threshold-based methods are valued for their simplicity, efficiency, and accuracy and have been successfully applied to crop vegetation coverage extraction from UAV remote sensing images [36]. Among thresholding approaches, the maximum entropy method and the bimodal histogram method are two widely used techniques for FVC extraction. To better evaluate the effectiveness of the two vegetation indices constructed in this study under different methods, the 10 vegetation indices were combined with the maximum entropy thresholding method and the bimodal histogram method to extract potato vegetation coverage during the three growth periods. By comparing the FVC extraction accuracy of the 10 vegetation indices under the vegetation index intersection method, the maximum entropy thresholding method, and the bimodal histogram method, the most suitable vegetation index and extraction method were identified. For a more intuitive comparison, the results are visualized in Figure 13.
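As a reference for this comparison, the maximum entropy (Kapur) thresholding method can be sketched as follows: the index histogram is split at every candidate level, and the split that maximizes the summed entropies of the two classes is kept. This is a generic numpy sketch, not the study's implementation.

```python
import numpy as np

def max_entropy_threshold(gray, bins=256):
    """Kapur's maximum entropy threshold on a grayscale index image."""
    hist, edges = np.histogram(gray.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    c = p.cumsum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        w0 = c[t - 1]              # class probabilities at this split
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        lo, hi = p[:t], p[t:]
        p0 = lo[lo > 0] / w0       # renormalized class distributions
        p1 = hi[hi > 0] / w1
        h = -(p0 * np.log(p0)).sum() - (p1 * np.log(p1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return edges[best_t]           # gray value at the chosen bin edge
```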
As shown in Figure 13a, during the potato tuber formation stage, the BGCI combined with the vegetation index intersection method achieved the best vegetation coverage extraction accuracy. Compared with the maximum entropy thresholding method and the bimodal histogram method, the RGCI combined with the vegetation index intersection method also achieved greater accuracy. Figure 13b shows that during the potato tuber expansion stage, the BGCI combined with the vegetation index intersection method again achieved the best extraction accuracy. As shown in Figure 13c, during the potato maturation stage, the RGCI combined with the vegetation index intersection method achieved the highest accuracy, whereas the BGCI did not repeat the best performance it showed in the previous two growth stages.
Based on the analysis results of the 10 vegetation indices combined with the 3 vegetation coverage extraction methods, it can be concluded that during the potato tuber formation and expansion stages, the BGCI combined with the vegetation index intersection method achieved the best vegetation coverage extraction accuracy. During the potato maturation stage, the RGCI combined with the vegetation index intersection method achieved the highest vegetation coverage extraction accuracy.
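The bimodal histogram method compared above can likewise be sketched: smooth the index histogram, locate the two dominant peaks, and take the lowest point between them as the threshold. The smoothing width and peak-exclusion window below are illustrative choices, not the study's parameters.

```python
import numpy as np

def bimodal_valley_threshold(gray, bins=256, smooth=15):
    """Valley threshold between the two dominant histogram peaks."""
    hist, edges = np.histogram(gray.ravel(), bins=bins)
    kernel = np.ones(smooth) / smooth
    h = np.convolve(hist.astype(float), kernel, mode="same")  # smoothed
    p1 = int(np.argmax(h))                    # dominant peak
    mask = np.ones(bins, dtype=bool)          # exclude a window around it
    mask[max(0, p1 - bins // 8):p1 + bins // 8] = False
    p2 = int(np.argmax(np.where(mask, h, -1.0)))  # second peak
    a, b = sorted((p1, p2))
    valley = a + int(np.argmin(h[a:b + 1]))   # lowest point between peaks
    return edges[valley]
```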

4.2. Validation of the General Applicability of the Vegetation Index Intersection Method

The potato vegetation coverage extraction results of Experiment B show that the proposed vegetation indices combined with the vegetation index intersection method achieved the highest accuracy. To further explore whether this method is transferable, data from Experiment A were used to validate the general applicability of the vegetation index intersection method.
As shown in Figure 14, after irrigating the potatoes, some of the soil on the surface of the fields in Experiment A had not dried and appeared darker in color. This caused the grayscale images of the RGBVI, NGRVI, MGBDI, NGBDI, MGRVI, and GRVI to show less distinct differences between the soil and potato vegetation. The brighter portions of the soil had higher grayscale values, similar to those of the potato plants. This phenomenon can lead to lower vegetation coverage extraction accuracy [37].
In the grayscale images of the RGCI, BGCI, EXG, TRVI, and GLI, the potato plants had higher grayscale values and appeared white, while the soil areas had lower grayscale values and appeared black (Figure 15); this allowed for a clear distinction between the plants and the soil. However, the EXG histogram exhibited irregularities (Figure 16), making it difficult to determine the vegetation coverage extraction threshold using the vegetation index intersection method. Therefore, the RGCI, BGCI, TRVI, and GLI were selected to extract potato vegetation coverage using the vegetation index intersection method.
Using visual interpretation, 300 regions of interest containing potato plants and soil were selected on the orthoimages of the experimental field, and the potato vegetation coverage extracted using the vegetation index intersection method was verified with a confusion matrix. As shown in Table 4, during the first two growth stages of the potatoes, the BGCI achieved the highest vegetation coverage extraction accuracy: the overall classification accuracy and kappa coefficient were 99.97% and 0.9974 in the tuber formation stage and 99.81% and 0.9999 in the tuber expansion stage, respectively. For the RGCI, the overall classification accuracy and kappa coefficient were 99.79% and 0.9798 in the tuber formation stage and 99.77% and 0.9952 in the tuber expansion stage, respectively. These results are consistent with the conclusions drawn from Experiment B, demonstrating that the two vegetation indices proposed in this study, combined with the vegetation index intersection method, achieved ideal accuracy and general applicability for vegetation coverage extraction.
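The overall accuracy and kappa coefficient reported from the confusion matrices can be computed as follows; this is a generic sketch for a two-class (vegetation/soil) matrix, and the counts in the test are hypothetical.

```python
import numpy as np

def overall_accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows: reference classes, columns: predicted classes)."""
    cm = np.asarray(cm, float)
    n = cm.sum()
    po = np.trace(cm) / n                        # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2  # chance agreement
    return po, (po - pe) / (1.0 - pe)
```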

4.3. The Effect of Fitting Parameter Selection on Estimation Accuracy

Experiment A and Experiment B involved different potato varieties, soil types, and growth environments. To validate the generality of the potato FVC estimation model developed in Experiment B, FVC data extracted from Experiment A using the BGCI combined with the vegetation index intersection method were used as independent data to validate the estimation model established in Experiment B. Figure 17 shows the accuracy validation results of the potato FVC models based on the BGCI and RGCI.
The results indicate that the potato FVC models based on the BGCI and RGCI achieved ideal estimation accuracy, with R2 values exceeding 0.94 and RMSE values below 2.30 for both models. Compared with Experiment B, both the RGCI and BGCI models estimated the FVC of Experiment A more accurately. The main factor behind this difference is that Experiment B included three potato varieties, which made the field background more complex and reduced the accuracy of potato FVC estimation. The BGCI models in both Experiment A and Experiment B achieved R2 values greater than 0.90, while the RGCI models exceeded an R2 of 0.89. These validation results demonstrate that the FVC estimation models constructed using the proposed BGCI and RGCI have high accuracy and strong generality: they improve on traditional parameters for FVC estimation from UAV visible light images and perform well in different planting environments and growth stages.
To put the performance of the established model in context, this study further compared the results with those of other researchers. Xie et al. used a drone equipped with an RGB camera to obtain potato field images; by applying the Otsu method with the EXG vegetation index to highlight canopy pixels for segmentation, their coverage estimation model achieved an R2 of 0.99 [38]. The estimation accuracy of the model established in this study is slightly lower, mainly because it was built from multi-period data with a long acquisition cycle, and the complex field environment strongly influenced model development. In contrast, Xie et al. built their model from single-period potato RGB images and validated it with data from the same period. Although their model achieves higher predictive accuracy with sufficient sample sizes, its predictions of potato coverage across long periods and different field environments are poorer. Njane et al. estimated potato vegetation coverage using a multispectral camera, achieving a model with an R2 of 0.992 and an RMSE of 0.037 [39], higher than the accuracy of the model established in this study. This study used visible light remote sensing images, which carry less crop information than multispectral images, resulting in lower estimation accuracy. Nevertheless, the model established here still achieved an R2 above 0.9 and maintained the same prediction accuracy in other field environments, demonstrating that combining the newly constructed BGCI and RGCI with the vegetation index intersection method effectively compensates for the limitations of visible light images in estimating potato FVC.

5. Conclusions

This study analyzed the differences in grayscale values between potato plants and the background in visible light field images and constructed the RGCI and BGCI vegetation indices based on the relationships between the red-green and blue-green bands. The computation of eight common visible light vegetation indices and twenty-four texture features was described. Potato FVC was extracted using the vegetation index intersection method, the maximum entropy thresholding method, and the bimodal histogram method; the accuracy of the 10 vegetation indices combined with the three FVC extraction methods was validated using a confusion matrix, and the best vegetation index and extraction method were selected. The Pearson correlation coefficient method and random forest feature selection were then used to screen these vegetation indices and the 24 texture features, and potato FVC estimation models were established from the top six selected features. The main conclusions are as follows:
(1) This study constructed two new vegetation indices, the BGCI and RGCI, and successfully obtained classification thresholds by combining the SVM with these vegetation indices, which can effectively differentiate between the background and potato plants. The extraction thresholds for the BGCI during the three growth periods were −13.0583, 10.1801, and −4.3000, and those for the RGCI were 2.5892, 23.0584, and 16.9357, respectively. Under these thresholds, the BGCI and RGCI effectively distinguished potato plants from the background.
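Once a stage-specific threshold is known, FVC extraction reduces to binarizing the index image and taking the vegetation pixel fraction. The sketch below uses the BGCI thresholds reported above; whether vegetation lies above or below the threshold depends on the index definition, so the direction chosen here is an assumption for illustration.

```python
import numpy as np

# Stage-specific BGCI extraction thresholds reported in the study
# (tuber formation, tuber expansion, maturation).
BGCI_THRESHOLDS = {"formation": -13.0583, "expansion": 10.1801,
                   "maturation": -4.3000}

def fvc_from_index(index_image, threshold, vegetation_above=True):
    """Binarize a vegetation index image at `threshold` and return the
    fraction of pixels classified as vegetation (FVC, in %)."""
    if vegetation_above:
        mask = index_image > threshold
    else:
        mask = index_image < threshold
    return 100.0 * mask.mean()
```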
(2) The BGCI and RGCI combined with the vegetation index intersection method both achieved excellent potato vegetation coverage extraction throughout the entire growth period. During the tuber formation and expansion stages, the BGCI combined with the vegetation index intersection method achieved the highest extraction accuracy, with overall accuracies of 99.61% and 98.84%, respectively, while the RGCI achieved the highest accuracy, 98.63%, during the maturation stage. Overall, the two indices combined with the vegetation index intersection method provided the most reliable vegetation coverage extraction across the entire potato growth period.
(3) This study screened multiple vegetation indices and texture features and established a highly accurate potato vegetation coverage estimation model. Using the Pearson correlation coefficient method and random forest feature selection, six vegetation indices highly correlated with potato FVC (BGCI, NGBDI, RGCI, RGBVI, NGRVI, and EXG) were selected, and corresponding vegetation coverage estimation models were constructed. Among these models, the BGCI-based model exhibited the highest accuracy (R2 = 0.9116, RMSE = 5.7903). In the validation of model generality, the FVC estimation models based on the newly constructed BGCI and RGCI also achieved good accuracy: the RGCI model had an R2 of 0.9497 and an RMSE of 2.2548, and the BGCI model had an R2 of 0.9557 and an RMSE of 2.1486. The potato vegetation coverage models based on the two novel indices proposed in this study thus demonstrated good accuracy and generality. High-precision, generally applicable FVC estimation supports the real-time monitoring of potato vegetation coverage: the actual growth condition of potatoes can be assessed promptly, and water and fertilizer inputs can be controlled in a timely and effective manner. This is important for guiding the field management of tuber crops to improve potato yields.

Author Contributions

Methodology, X.S. and H.Y.; formal analysis, X.S. and T.G.; data curation, T.G. and R.L.; writing—original draft X.S. and H.Y.; writing—review and editing, H.Y., R.L., L.Y. and Y.H.; visualization, Y.C.; investigation, Y.C.; supervision, L.Y. and Y.H.; project administration, Y.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant no. 32171894) and supported by the China Scholarship Council, China (no. 202208330269).

Data Availability Statement

The data are contained within the article.

Acknowledgments

We would like to thank Chao Wang and Shaoxiang Wang for their work on field data collection.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Huang, X.; Cheng, F.; Wang, J.L.; Duan, P.; Wang, J.S. Forest Canopy Height Extraction Method Based on ICESat-2/ATLAS Data. IEEE Trans. Geosci. Remote Sens. 2023, 61, 14.
2. de la Casa, A.; Ovando, G.; Bressanini, L.; Martinez, J.; Diaz, G.; Miranda, C. Soybean crop coverage estimation from NDVI images with different spatial resolution to evaluate yield variability in a plot. ISPRS J. Photogramm. Remote Sens. 2018, 146, 531–547.
3. Zhang, D.; Mansaray, L.R.; Jin, H.; Sun, H.; Kuang, Z.; Huang, J. A universal estimation model of fractional vegetation cover for different crops based on time series digital photographs. Comput. Electron. Agric. 2018, 151, 93–103.
4. Teng, Y.J.; Ren, H.Z.; Zhu, J.S.; Jiang, C.C.; Ye, X.; Zeng, H. A practical method for angular normalization on land surface temperature using space between thermal radiance and fraction of vegetation cover. Remote Sens. Environ. 2023, 291, 20.
5. Neyns, R.; Canters, F. Mapping of Urban Vegetation with High-Resolution Remote Sensing: A Review. Remote Sens. 2022, 14, 1031.
6. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136.
7. Ma, C.; Liu, M.; Ding, F.; Li, C.; Cui, Y.; Chen, W.; Wang, Y. Wheat growth monitoring and yield estimation based on remote sensing data assimilation into the SAFY crop growth model. Sci. Rep. 2022, 12, 5473.
8. Worrall, G.; Judge, J.; Boote, K.; Rangarajan, A. In-season crop phenology using remote sensing and model-guided machine learning. Agron. J. 2023, 115, 1214–1236.
9. Shamsuzzoha, M.; Shaw, R.; Ahamed, T. Machine learning system to assess rice crop change detection from satellite-derived RGVI due to tropical cyclones using remote sensing dataset. Remote Sens. Appl. Soc. Environ. 2024, 35, 101201.
10. Feng, A.; Zhou, J.; Vories, E.D.; Sudduth, K.A.; Zhang, M. Yield estimation in cotton using UAV-based multi-sensor imagery. Biosyst. Eng. 2020, 193, 101–114.
11. Khan, A.; Vibhute, A.D.; Mali, S.; Patil, C.H. A systematic review on hyperspectral imaging technology with a machine and deep learning methodology for agricultural applications. Ecol. Inform. 2022, 69, 101678.
12. Yang, Q.; Shi, L.; Han, J.; Yu, J.; Huang, K. A near real-time deep learning approach for detecting rice phenology based on UAV images. Agric. For. Meteorol. 2020, 287, 107938.
13. Hu, J.Y.; Feng, H.; Wang, Q.L.; Shen, J.N.; Wang, J.; Liu, Y.; Feng, H.K.; Yang, H.; Guo, W.; Qiao, H.B.; et al. Pretrained Deep Learning Networks and Multispectral Imagery Enhance Maize LCC, FVC, and Maturity Estimation. Remote Sens. 2024, 16, 784.
14. Niu, Y.X.; Han, W.T.; Zhang, H.H.; Zhang, L.Y.; Chen, H.P. Estimating fractional vegetation cover of maize under water stress from UAV multispectral imagery using machine learning algorithms. Comput. Electron. Agric. 2021, 189, 11.
15. Wang, Z.; Chen, W.; Xing, J.H.; Zhang, X.P.; Tian, H.J.; Tang, H.Z.; Bi, P.S.; Li, G.C.; Zhang, F.J. Extracting vegetation information from high dynamic range images with shadows: A comparison between deep learning and threshold methods. Comput. Electron. Agric. 2023, 208, 12.
16. Yu, Y.; Bao, Y.D.; Wang, J.C.; Chu, H.J.; Zhao, N.; He, Y.; Liu, Y.F. Crop Row Segmentation and Detection in Paddy Fields Based on Treble-Classification Otsu and Double-Dimensional Clustering Method. Remote Sens. 2021, 13, 901.
17. Ding, F.; Li, C.C.; Zhai, W.G.; Fei, S.P.; Cheng, Q.; Chen, Z. Estimation of Nitrogen Content in Winter Wheat Based on Multi-Source Data Fusion and Machine Learning. Agriculture 2022, 12, 1752.
18. Chen, H.; Li, W.; Zhu, Y.Y. Improved window adaptive gray level co-occurrence matrix for extraction and analysis of texture characteristics of pulmonary nodules. Comput. Meth. Programs Biomed. 2021, 208, 6.
19. Jiao, L.M.; Denoeux, T.; Liu, Z.G.; Pan, Q. EGMM: An evidential version of the Gaussian mixture model for clustering. Appl. Soft. Comput. 2022, 129, 17.
20. Yu, Y.; Meng, L.; Luo, C.; Qi, B.; Zhang, X.; Liu, H. Early Mapping Method for Different Planting Types of Rice Based on Planet and Sentinel-2 Satellite Images. Agronomy 2024, 14, 137.
21. Yang, H.B.; Hu, Y.H.; Zheng, Z.Z.; Qiao, Y.C.; Hou, B.R.; Chen, J. A New Approach for Nitrogen Status Monitoring in Potato Plants by Combining RGB Images and SPAD Measurements. Remote Sens. 2022, 14, 4814.
22. Madec, S.; Irfan, K.; Velumani, K.; Baret, F.; David, E.; Daubige, G.; Samatan, L.B.; Serouart, M.; Smith, D.; James, C.; et al. VegAnn, Vegetation Annotation of multi-crop RGB images acquired under diverse conditions for segmentation. Sci. Data 2023, 10, 12.
23. Li, D.L.; Li, C.; Yao, Y.; Li, M.D.; Liu, L.C. Modern imaging techniques in plant nutrition analysis: A review. Comput. Electron. Agric. 2020, 174, 14.
24. Jiang, J.; Atkinson, P.M.; Chen, C.S.; Cao, Q.; Tian, Y.C.; Zhu, Y.; Liu, X.J.; Cao, W.X. Combining UAV and Sentinel-2 satellite multi-spectral images to diagnose crop growth and N status in winter wheat at the county scale. Field Crop. Res. 2023, 294, 13.
25. Yang, N.; Zhang, Z.T.; Zhang, J.R.; Guo, Y.H.; Yang, X.Z.; Yu, G.D.; Bai, X.Q.; Chen, J.Y.; Chen, Y.W.; Shi, L.S.; et al. Improving estimation of maize leaf area index by combining of UAV-based multispectral and thermal infrared data: The potential of new texture index. Comput. Electron. Agric. 2023, 214, 16.
26. Garofalo, S.P.; Giannico, V.; Costanza, L.; Alhajj Ali, S.; Camposeo, S.; Lopriore, G.; Pedrero Salcedo, F.; Vivaldi, G.A. Prediction of Stem Water Potential in Olive Orchards Using High-Resolution Planet Satellite Images and Machine Learning Techniques. Agronomy 2023, 14, 1.
27. Shi, H.Z.; Guo, J.J.; An, J.Q.; Tang, Z.J.; Wang, X.; Li, W.Y.; Zhao, X.; Jin, L.; Xiang, Y.Z.; Li, Z.J.; et al. Estimation of Chlorophyll Content in Soybean Crop at Different Growth Stages Based on Optimal Spectral Index. Agronomy 2023, 13, 663.
28. Yang, H.B.; Hu, Y.H.; Zheng, Z.Z.; Qiao, Y.C.; Zhang, K.L.; Guo, T.F.; Chen, J. Estimation of Potato Chlorophyll Content from UAV Multispectral Images with Stacking Ensemble Algorithm. Agronomy 2022, 12, 2318.
29. Zhang, M.Z.; Chen, T.E.; Gu, X.H.; Kuai, Y.; Wang, C.; Chen, D.; Zhao, C.J. UAV-borne hyperspectral estimation of nitrogen content in tobacco leaves based on ensemble learning methods. Comput. Electron. Agric. 2023, 211, 11.
30. Huang, L.S.; Li, T.K.; Ding, C.L.; Zhao, J.L.; Zhang, D.Y.; Yang, G.J. Diagnosis of the Severity of Fusarium Head Blight of Wheat Ears on the Basis of Image and Spectral Feature Fusion. Sensors 2020, 20, 2887.
31. Hayit, T.; Erbay, H.; Varçin, F.; Hayit, F.; Akci, N. The classification of wheat yellow rust disease based on a combination of textural and deep features. Multimed. Tools Appl. 2023, 82, 47405–47423.
32. Liao, J.; Wang, Y.; Zhu, D.Q.; Zou, Y.; Zhang, S.; Zhou, H.Y. Automatic Segmentation of Crop/Background Based on Luminance Partition Correction and Adaptive Threshold. IEEE Access 2020, 8, 202611–202622.
33. Zhang, H.S.; Huang, L.S.; Huang, W.J.; Dong, Y.Y.; Weng, S.Z.; Zhao, J.L.; Ma, H.Q.; Liu, L.Y. Detection of wheat Fusarium head blight using UAV-based spectral and image feature fusion. Front. Plant Sci. 2022, 13, 14.
34. Yang, H.; Lan, Y.; Lu, L.; Gong, D.; Miao, J.; Zhao, J. New method for cotton fractional vegetation cover extraction based on UAV RGB images. Int. J. Agric. Biol. Eng. 2022, 15, 172–180.
35. Estévez, J.; Salinero-Delgado, M.; Berger, K.; Pipia, L.; Rivera-Caicedo, J.P.; Wocher, M.; Reyes-Muñoz, P.; Tagliabue, G.; Boschetti, M.; Verrelst, J. Gaussian processes retrieval of crop traits in Google Earth Engine based on Sentinel-2 top-of-atmosphere data. Remote Sens. Environ. 2022, 273, 19.
36. Aslan, M.F.; Durdu, A.; Sabanci, K.; Ropelewska, E.; Gueltekin, S.S. A Comprehensive Survey of the Recent Studies with UAV for Precision Agriculture in Open Fields and Greenhouses. Appl. Sci. 2022, 12, 1047.
37. Li, L.Y.; Mu, X.H.; Jiang, H.L.; Chianucci, F.; Hu, R.H.; Song, W.J.; Qi, J.B.; Liu, S.Y.; Zhou, J.X.; Chen, L.; et al. Review of ground and aerial methods for vegetation cover fraction (fCover) and related quantities estimation: Definitions, advances, challenges, and future perspectives. ISPRS J. Photogramm. Remote Sens. 2023, 199, 133–156.
38. Xie, J.; Zhou, Z.; Zhang, H.; Zhang, L.; Li, M. Combining Canopy Coverage and Plant Height from UAV-Based RGB Images to Estimate Spraying Volume on Potato. Sustainability 2022, 14, 6473.
39. Njane, S.N.; Tsuda, S.; van Marrewijk, B.M.; Polder, G.; Katayama, K.; Tsuji, H. Effect of varying UAV height on the precise estimation of potato crop growth. Front. Plant Sci. 2023, 14, 1233349.
Figure 1. Location and division of the two research fields and UAV-based field observations. (a) Location of the experimental areas in Shaanxi Province; (b,c) aerial views of Experiment Area A and Experiment Area B.
Figure 2. Scatter plots of potato plants, soil, shadows, and drip irrigation strips. (a) Red-green band combination; (b) blue-green band combination; (c) blue-red band combination. Green dots represent plants, yellow dots represent soil, black dots represent shadows, and purple dots represent drip irrigation tape.
Figure 3. Fitting results of BGCI and RGCI vegetation indices. (a) Blue-green band combination; (b) blue-red band combination.
Figure 4. Validation results of the confusion matrix for FVC extraction thresholds at different growth stages of potatoes. (a) Tuber formation stage; (b) tuber expansion stage; (c) tuber maturation stage.
Figure 5. Process of potato plant threshold extraction based on the vegetation index threshold method during the tuber formation period. (a) RGCI; (b) BGCI; (c) EXG.
Figure 6. Process of potato plant threshold extraction based on the vegetation index threshold method during the tuber expansion period. (a) RGCI; (b) BGCI; (c) EXG.
Figure 7. Process of potato plant threshold extraction based on the vegetation index threshold method during the maturation stage. (a) RGCI; (b) BGCI; (c) EXG.
Figure 8. Extraction results of potato vegetation coverage based on the vegetation index intersection method.
Figure 9. Confusion matrix verification results of potato vegetation coverage extraction accuracy based on the vegetation index intersection method.
Figure 10. Extraction results of FVC based on the BGCI and RGCI vegetation index intersection methods. (a) Tuber formation period; (b) tuber expansion period; (c) tuber maturation period.
Figure 11. Results of feature selection using the Pearson correlation coefficient method in the entire stage.
Figure 12. Results of model accuracy in the entire stage. (a) BGCI; (b) NGBDI; (c) RGCI; (d) RGBVI; (e) NGRVI; (f) EXG.
Figure 13. Comparison of the extraction accuracy of potato FVC using the three threshold extraction methods. (a) Tuber formation stage; (b) tuber expansion stage; (c) tuber maturation stage.
Figure 14. Comparison of the vegetation index misclassification results. The orange circle indicates areas of high vegetation coverage, and the yellow circle indicates areas of low vegetation coverage.
Figure 15. Grayscale images of the five vegetation indices.
Figure 16. Histogram of the EXG.
Figure 17. Validation of the accuracy of the FVC estimation model established by the RGCI and BGCI. (a) BGCI; (b) RGCI.
Table 1. Statistical results of the pixel values of potato vegetation, soil, drip irrigation belts, and shadows in the blue, green, and red bands.
Typical Objects      | Blue Band (Avg / SD) | Green Band (Avg / SD) | Red Band (Avg / SD)
Potato               | 84.010 / 20.734      | 135.858 / 21.962      | 79.523 / 20.165
Soil                 | 219.783 / 12.291     | 204.621 / 12.597      | 173.848 / 16.144
Shadow               | 83.114 / 33.934      | 75.344 / 30.681       | 52.283 / 27.783
Drip irrigation tape | 146.108 / 37.974     | 136.684 / 37.058      | 121.778 / 38.578
Table 2. Names of 24 texture features.
Texture Feature | Red Band        | Green Band      | Blue Band
Mean            | R-Mean          | G-Mean          | B-Mean
Variance        | R-Variance      | G-Variance      | B-Variance
Homogeneity     | R-Homogeneity   | G-Homogeneity   | B-Homogeneity
Contrast        | R-Contrast      | G-Contrast      | B-Contrast
Dissimilarity   | R-Dissimilarity | G-Dissimilarity | B-Dissimilarity
Entropy         | R-Entropy       | G-Entropy       | B-Entropy
Second moment   | R-Second moment | G-Second moment | B-Second moment
Correlation     | R-Correlation   | G-Correlation   | B-Correlation
Table 3. Threshold extraction results based on the vegetation index threshold method.
Vegetation Index | Tuber Formation Stage | Tuber Expansion Stage | Tuber Maturation Stage
EXG              | 29.567                | 51.642                | 30.342
GLI              | 0.071                 | 0.111                 | 0.075
GRVI             | 0.002                 | 0.053                 | 0.004
TRVI             | −13.987               | −10.125               | −20.482
NGBDI            | 0.098                 | 0.163                 | 0.149
RGBVI            | 0.143                 | 0.222                 | 0.169
RGFI             | −6.825                | 16.943                | −6.698
MGRVI            | 0.074                 | 0.111                 | 0.014
NGRVI            | −3.951                | −5.209                | −1.759
RGCI             | 2.589                 | 23.058                | 16.936
BGCI             | −13.058               | 10.180                | −4.300
Table 4. Confusion matrix validation results of potato vegetation coverage extraction accuracy in Experiment A based on the vegetation index intersection method.
Vegetation Index | Image Acquisition Period | Overall Accuracy (%) | Kappa Coefficient
GLI              | Tuber formation period   | 99.12                | 0.9210
GLI              | Tuber expansion period   | 99.76                | 0.9951
TRVI             | Tuber formation period   | 96.81                | 0.7591
TRVI             | Tuber expansion period   | 97.67                | 0.9525
RGCI             | Tuber formation period   | 99.79                | 0.9798
RGCI             | Tuber expansion period   | 99.77                | 0.9952
BGCI             | Tuber formation period   | 99.97                | 0.9974
BGCI             | Tuber expansion period   | 99.81                | 0.9999
Shi, X.; Yang, H.; Chen, Y.; Liu, R.; Guo, T.; Yang, L.; Hu, Y. Research on Estimating Potato Fraction Vegetation Coverage (FVC) Based on the Vegetation Index Intersection Method. Agronomy 2024, 14, 1620. https://doi.org/10.3390/agronomy14081620