Article

Citrus Canopy SPAD Prediction under Bordeaux Solution Coverage Based on Texture- and Spectral-Information Fusion

College of Geomatics and Geoinformation, Guilin University of Technology, Guilin 541006, China
*
Author to whom correspondence should be addressed.
Agriculture 2023, 13(9), 1701; https://doi.org/10.3390/agriculture13091701
Submission received: 26 July 2023 / Revised: 17 August 2023 / Accepted: 26 August 2023 / Published: 29 August 2023
(This article belongs to the Section Digital Agriculture)

Abstract

Rapid and nondestructive prediction of chlorophyll content, which reflects the growth status of various crops, using remote sensing technology is a prominent topic in agricultural remote sensing research. Bordeaux mixture has been extensively employed for managing citrus diseases, such as black star and ulcer disease. However, the presence of pesticide residues in Bordeaux mixture can significantly modify the spectral response of the citrus canopy, thereby exerting a substantial influence on the accurate prediction of agronomic indices in fruit trees. In this study, we used unmanned aerial vehicle (UAV) multispectral imaging technology to obtain remote sensing imagery of Bordeaux-covered citrus canopies during the months of July, September, and November. We integrated spectral and texture information to construct a high-dimensional feature dataset and performed data downscaling and feature optimization. Furthermore, we established four machine learning models, namely, partial least squares regression (PLS), ridge regression (RR), random forest (RF), and support vector regression (SVR). Our objectives were to identify the most effective prediction model for estimating the SPAD (soil plant analysis development) value of Bordeaux-covered citrus canopies, assess the variation in prediction accuracy between fused features and individual features, and investigate the impact of Bordeaux solution on the spectral reflectance of the citrus canopy. The results showed that (1) the impact of Bordeaux mixture on citrus canopy reflectance bands ranked from the highest to the lowest as follows: near-infrared band at 840 nm, red-edge band at 730 nm, blue band at 450 nm, green band at 560 nm, and red band at 650 nm.
(2) Fused feature models had better prediction ability than single-feature modeling, with an average R2 value of 0.641 for the four model test sets, improving by 0.117 and 0.039, respectively, compared with single-TF (texture feature) and -VI (vegetation index) modeling, and the test-set root-mean-square error (RMSE) was 2.594 on average, which was 0.533 and 0.264 lower than single-TF and -VI modeling, respectively. (3) Multiperiod data fusion effectively enhanced the correlation between features and SPAD values and consequently improved model prediction accuracy. Compared with accuracy based on individual months, R improved by 0.013 and 0.011, while RMSE decreased by 0.112 and 0.305. (4) The SVR model demonstrated the best performance in predicting citrus canopy SPAD under Bordeaux solution coverage, with R2 values of 0.629 and 0.658, and RMSE values of 2.722 and 2.752 for the training and test sets, respectively.

1. Introduction

China is a major citrus-producing and -consuming country, and the citrus industry is a pillar and characteristic industry in many southern provinces [1]. Chlorophyll, a vital component in citrus growth [2], plays a crucial role in assessing the health status of citrus growth, nutrient stress levels, and irrigation and fertilization requirements. Hence, the rapid and nondestructive determination of relative chlorophyll content in citrus fruit trees has great significance.
Bordeaux mixture (Bm), a copper-based pesticide, has been extensively employed for controlling citrus foliar diseases, such as black star and ulcer diseases, owing to its capacity to effectively manage disease symptoms, enhance leaf coloration, stimulate vigorous growth, and improve disease resistance [3]. Previous studies have demonstrated that the presence of Bordeaux mixture on dried citrus canopy leaves can modify the spectral response of the citrus canopy, resulting in significant implications for remote sensing-based prediction of agronomic indicators in fruit trees [4,5]. However, the specific spectral effect and its magnitude remain to be further investigated, and there is currently a lack of research focused on monitoring SPAD values within citrus canopies covered with Bordeaux mixture. To optimize the management of citrus fruit trees, it is recommended to apply Bordeaux mixture annually in early July. The period from July to November is crucial because it represents the stable fruiting phase, strong fruiting stage, and fruit harvesting period for citrus trees. During this time, it becomes essential to closely monitor the nutritional status of fruit trees to minimize conflicts between fruit development and nutrient availability for autumnal growth. Consequently, conducting research activities that involve extracting sensitive spectral features under the influence of Bordeaux mixture, developing a model for estimating SPAD values within the citrus canopy, and enhancing its accuracy hold great practical significance.
UAV multispectral remote sensing imaging technology has been utilized in the field of plant growth health monitoring [6]. Previous findings have revealed that by constructing vegetation indices and employing inversion techniques to estimate plant growth and physiological parameters, the prediction accuracy of models can be significantly enhanced. Liu et al. studied the effect of nitrogen application on different rice varieties and different fertility stages based on UAV remote sensing and machine learning and used vegetation indices to predict multiple physiological parameters in rice [7]. Wang Dan et al. used UAV multispectral image data and selected four spectral parameters to construct a multiple linear regression model for summer maize SPAD. The coefficient of determination (R2) was 0.885, and the root-mean-square error (RMSE) was 2.111 [8]. Fu Bolin and Niu Qinglin et al. constructed different spectral indices for target physiological parameters based on UAV data and obtained high estimation accuracy [9,10]. Notably, multispectral images possess the advantage of incorporating both spectral and texture information within a unified map [11]. However, the previously mentioned studies solely focused on spectral information for prediction purposes, overlooking the abundant texture information present in multispectral images. Additionally, no consideration was given to the potential impact of texture features on the prediction of vegetation physiological indices. In recent years, numerous scholars have made further advancements in the exploration of valuable information contained in UAV images by extracting texture features from spectral images. For example, Zhu et al. determined the best spatial resolution and optimal model for monitoring wheat black star disease by fusing vegetation indices and texture features using UAV multispectral remote sensing [12]. Guo et al.
integrated spectral and texture information based on UAV RGB images to identify the summer maize cob stage [13]. Xu Yi et al. found that effectively fusing spectral and texture features improved mangrove classification accuracy compared with single spectral and texture features [14]. The existing utilization of fusion data predominantly focuses on the identification and categorization of plant species, vegetation diseases, and physiological traits. However, limited research efforts have been dedicated to citrus physiological parameters. Furthermore, there is an urgent need to validate the construction model for estimating the SPAD of citrus canopy under Bordeaux mixture cover using a fused approach that incorporates spectral and texture information.
Based on the above analysis, this study focuses on the citrus experimental base located in Gongcheng Yao Autonomous County, Guilin City. By employing UAV multispectral imaging technology, remote sensing images of citrus fruit tree canopies were acquired during the months of July, September, and November under varying levels of Bordeaux solution coverage. Subsequently, a prediction model for estimating the SPAD of the citrus canopy was developed. The main research contents include (1) analyzing the spectral variations in citrus fruit trees with different levels of Bordeaux mixture coverage; (2) integrating mapping information and generating distinct feature datasets corresponding to different levels of coverage, employing data downscaling and feature selection; (3) establishing SPAD prediction models for citrus canopies at different coverage levels and assessing the predictive accuracy of four distinct models using quantitative analysis; and (4) evaluating variations in predictive accuracy in citrus canopy SPAD at different coverage levels based on the optimal model and subsequently generating canopy SPAD prediction maps for each specified period.

2. Materials and Methods

2.1. Study Area and Field Experiments

The experimental site is situated at Peng Yu Brothers Farm’s citrus experimental base, located in Gongcheng Yao Autonomous County, Guilin City, Guangxi Zhuang Autonomous Region (24°51′42.07″ N, 110°49′7.06″ E) (Figure 1). This region has a subtropical monsoon climate with mild weather, ample sunlight, abundant rainfall, and rainfall and heat concentrated in the same season. These environmental conditions are highly conducive to the growth of citrus fruit trees. The soil composition in the study area primarily consists of red soil. The selected citrus variety for this study is Merkot, and the average tree age is approximately 6 years. The orchards receive consistent irrigation, fertilization, and spraying. A total of one hundred representative fruit trees were carefully chosen for sampling in the study area, considering several factors, such as crown size, tree height, and slope.
The experiment was conducted from 14 July 2022 to 13 November 2022. Bordeaux mixture was applied to the fruit trees on 4 July, during a sunny period, using the dilute copper-and-concentrated lime method in the study area. The ratio used was 1:2:200 for copper sulfate, lime, and water. After a period of 15 days, a second spraying operation was carried out. The Bordeaux mixture applied to the fruit trees dried and formed a white solid substance that adhered to the citrus leaves. Consequently, during the study period, different levels of Bordeaux solution coverage were observed on the canopies of fruit trees in the study area. These levels of coverage were categorized as heavy, moderate, and light for the months of July, September, and November, respectively.

2.2. Multispectral Image Acquisition

In this study, a DJI Phantom 4 Multispectral drone was utilized to capture multispectral imagery. The drone is equipped with six 1/2.9-inch CMOS sensors, each with a resolution of 2.08 million pixels. Among these sensors, one is an RGB sensor used for visible imaging, while the remaining five are monochrome sensors employed for multispectral imaging. The maximum takeoff weight of the drone is 1487 g, and it can fly continuously for approximately 27 min. Detailed information regarding the spectral sensors can be found in Table 1.
Multispectral images were acquired from 12:00 noon to 1:00 p.m. on 14 July 2022, 7 September 2022, and 13 November 2022. The weather on the experimental days was clear, with no or few clouds and an open view. The UAV was flown at an altitude of 80 m, with an intended overlap rate of 80% in the heading direction and 60% in the side direction. The camera lens was positioned vertically downward to capture the imagery, and the ground sampling distance (GSD) was approximately 4.16 cm per pixel. In addition, a standardized calibration plate was positioned on a well-illuminated concrete road within the designated test region for subsequent radiometric correction and mitigation of insolation variations. Subsequently, the acquired UAV images were mosaicked, corrected, and clipped with Pix4D and ENVI 5.3 software. After the above processing, images with blue, green, red, red-edge, and near-infrared bands were obtained. Figure 2 illustrates the true-color images of the study area obtained during the months of July, September, and November.

2.3. SPAD Measurements

To match the multispectral imagery, the central location of each individual sampled fruit tree was recorded using real-time kinematic (RTK) positioning. Simultaneously, the chlorophyll content of the samples was measured using a handheld chlorophyll meter, specifically the SPAD-502Plus model, synchronized with the acquisition of multispectral images by the UAV. The SPAD of each sample tree was measured in four horizontal directions (east, west, south, and north) and at three vertical levels (upper, middle, and lower). Alcohol was used to wipe off the dried Bordeaux residue on the surface of the leaves; the SPAD values were then measured after the alcohol had evaporated for 3 min. A sample of 100 SPAD values was taken in each period for a total of 3 periods, and 300 SPAD data points were obtained in total.
The data of each period were divided in a ratio of 7:3, and 210 training samples and 90 test samples were obtained for preliminary SPAD processing. The characteristic statistics are presented in Table 2.
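The 7:3 partition described above can be sketched with scikit-learn's train_test_split; the feature matrix and SPAD vector below are random placeholders standing in for the 300 measured samples, not the real data:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Random placeholders for the 300 samples (100 trees x 3 periods);
# the real feature matrix and SPAD measurements are not reproduced here.
rng = np.random.default_rng(0)
X = rng.random((300, 5))
y = rng.uniform(60, 85, size=300)

# 7:3 split, giving 210 training and 90 test samples as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)
```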

2.4. Research Methodology

To achieve accurate SPAD prediction in citrus canopies covered with Bordeaux liquid, the following data processing methods are used: First, we fuse spectral and texture information of multispectral images, construct high-dimensional datasets, and perform data downscaling and feature optimization. Second, we construct a citrus canopy SPAD estimation model, select the RMSE and R2 to compare and analyze the estimation accuracy of different models, and then determine the best SPAD estimation model. Third, we verify the differences in SPAD estimation using image data of different Bordeaux liquid coverage levels and determine the best models for SPAD estimation with the corresponding coverage level. Finally, a thematic map depicting the distribution of SPAD for each coverage level within the study area is generated. The UAV, calibration plate, and SPAD-502Plus handheld chlorophyll meter are shown in Figure 3.

2.4.1. Calculation of the Vegetation Index

Twenty classical vegetation indices were selected [15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32], as shown in Appendix A, Table A1.
Given the Bordeaux mixture effect on the spectral response, classical vegetation spectral indices utilize a limited amount of characteristic band information (e.g., red-edge, red, and green bands). As a result, these indices fail to capture complete valid band information. In contrast, the combined vegetation index incorporates all band information from the multispectral image, and arranges and combines it to enhance the estimation of citrus canopy SPAD. Therefore, 265 combined vegetation indices were obtained after the raw bands of UAV images were subjected to multiple scattering correction (MSC), using Equations (1)–(5) for permutation and combination. When computing the combined vegetation indices, any duplicated indices between the combined vegetation indices and the five classical vegetation indices were excluded to ensure their mutual exclusivity. An index of the formulas resulting from the calculations can be found in Appendix B, Table A2.
CV1 = B_i / B_j        (1)
CV2 = B_i − B_j        (2)
CV3 = (B_i − B_j) / B_m        (3)
CV4 = (B_i − B_j) / (B_m − B_n)        (4)
CV5 = (B_i − B_j) / (B_m + B_n)        (5)
where CV1 to CV5 represent the five general equations for calculating the combined vegetation indices; i, j, m, and n take values from 1 to 5, representing the five spectral bands of the UAV images (blue, green, red, red-edge, and near-infrared), respectively; and B is the reflectance of the corresponding band.
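Assuming the five general equations take the ratio/difference forms above, the enumeration of band combinations can be sketched as follows. The combined_indices helper and band names are illustrative; this reading requires i, j, m, and n to be distinct, and the paper's own combination rules plus removal of duplicates with the classical indices yield the final count of 265:

```python
import itertools
import numpy as np

def combined_indices(bands):
    """Enumerate the CV1-CV5 families of combined vegetation indices
    from a dict mapping band names to reflectance arrays."""
    names = list(bands)
    out = {}
    for i, j in itertools.permutations(names, 2):
        out[f"{i}/{j}"] = bands[i] / bands[j]                      # CV1
        out[f"{i}-{j}"] = bands[i] - bands[j]                      # CV2
    for i, j, m in itertools.permutations(names, 3):
        out[f"({i}-{j})/{m}"] = (bands[i] - bands[j]) / bands[m]   # CV3
    for i, j, m, n in itertools.permutations(names, 4):
        out[f"({i}-{j})/({m}-{n})"] = (
            (bands[i] - bands[j]) / (bands[m] - bands[n]))         # CV4
        out[f"({i}-{j})/({m}+{n})"] = (
            (bands[i] - bands[j]) / (bands[m] + bands[n]))         # CV5
    return out

# Illustrative per-sample reflectance values for the five bands.
reflectance = dict(zip(["blue", "green", "red", "edge", "nir"],
                       np.array([[0.05], [0.08], [0.06], [0.20], [0.40]])))
cvs = combined_indices(reflectance)
```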

2.4.2. Extraction of Texture Features

In this study, multispectral texture features (TFs) were computed by applying gray-level co-occurrence matrix (GLCM) calculation. The GLCM yielded eight distinct texture features, namely, mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation. To extract the texture features, the window size was selected as 7 × 7; the step size was set to 3; the orientation was set to the 45° direction; and the gray level was set to 64. Figure 4 shows the texture feature maps of the images in each month. Columns are texture features; rows are bands (two rows are a band); and each band contains 8 texture features.
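As a rough illustration of the eight GLCM statistics listed above, the sketch below computes them for a single band with the stated settings (64 gray levels, step size 3, 45° direction) in plain NumPy. The paper extracts per-pixel texture maps with a 7 × 7 moving window; for brevity, this stand-alone function operates on the whole array instead:

```python
import numpy as np

def glcm_features(band, levels=64, d=3):
    """Eight GLCM texture statistics for one band image, using the paper's
    settings: 64 gray levels, step size 3, 45-degree direction."""
    # Quantize reflectance into `levels` gray levels.
    edges = np.linspace(band.min(), band.max(), levels + 1)[1:-1]
    q = np.digitize(band, edges)
    # Symmetric, normalized co-occurrence matrix for the 45-degree offset:
    # pixel (r, c) is paired with pixel (r - d, c + d).
    P = np.zeros((levels, levels))
    np.add.at(P, (q[d:, :-d].ravel(), q[:-d, d:].ravel()), 1)
    P = P + P.T
    P /= P.sum()
    i, j = np.indices(P.shape)
    mean = (i * P).sum()
    feats = {
        "mean": mean,
        "variance": ((i - mean) ** 2 * P).sum(),
        "homogeneity": (P / (1.0 + (i - j) ** 2)).sum(),
        "contrast": ((i - j) ** 2 * P).sum(),
        "dissimilarity": (np.abs(i - j) * P).sum(),
        "entropy": -(P[P > 0] * np.log(P[P > 0])).sum(),
        "second_moment": (P ** 2).sum(),
    }
    # Correlation between row and column indices under P (rows and columns
    # share the same mean and variance because P is symmetric).
    feats["correlation"] = (((i - mean) * (j - mean) * P).sum()
                            / feats["variance"])
    return feats

feats = glcm_features(np.random.default_rng(1).random((32, 32)))
```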

2.4.3. Fusing Texture and Spectral Information to Construct Feature Datasets

A citrus canopy SPAD feature dataset was constructed by combining classical vegetation indices (20), combined vegetation indices (265), and texture features (40). The Python 3.7 environment with PyCharm Community Edition 2020.2.1 x64 and scikit-learn library was used to partition the data in a ratio of 7:3 for each period and preprocess the data. The preprocessing steps employed in the study include (1) application of multiple scattering correction (MSC) to the original bands to mitigate spectral differences arising from different scattering levels caused by Bordeaux liquid, thereby enhancing the correlation between spectral and SPAD values; (2) data normalization using the function from the scikit-learn library to perform min–max normalization on the citrus canopy SPAD feature dataset; and (3) utilization of Box–Cox transformation and logarithmic transformation to enhance the normality, symmetry, and variance equality of the data.
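A minimal sketch of the three preprocessing steps, assuming scikit-learn and SciPy; the msc helper and the synthetic reflectance matrix are illustrative stand-ins, not the authors' code or data:

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.preprocessing import MinMaxScaler

def msc(spectra):
    """Multiple scattering correction: fit each spectrum against the mean
    spectrum and remove the fitted multiplicative/additive scatter terms."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra, dtype=float)
    for k, row in enumerate(spectra):
        slope, offset = np.polyfit(ref, row, 1)
        corrected[k] = (row - offset) / slope
    return corrected

# Synthetic 100-sample x 5-band reflectance matrix (blue..NIR) standing in
# for one period of canopy spectra; scatter is simulated as a per-sample
# gain and offset applied to a base spectrum.
rng = np.random.default_rng(0)
base = np.array([0.08, 0.12, 0.10, 0.35, 0.55])
spectra = (base * rng.uniform(0.8, 1.2, (100, 1))
           + rng.normal(0.0, 0.01, (100, 1))
           + rng.normal(0.0, 0.005, (100, 5)))

corrected = msc(spectra)                          # step (1): MSC
scaled = MinMaxScaler().fit_transform(corrected)  # step (2): min-max scaling
# Step (3): Box-Cox needs strictly positive input, so shift off zero first.
transformed, lam = boxcox(scaled[:, 0] + 1e-6)
```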

2.4.4. Data Dimensionality Reduction and Feature Filtering

Correlation is a statistical measure that quantifies the degree of association between two variables. In this study, we employed the Pearson correlation coefficient to assess the relationship between individual features and SPAD values. Specifically, we computed the correlation coefficients for each feature in relation to SPAD values and then identified the top 10 vegetation indices and 5 texture features with the highest correlations for each data period. Selecting features based on their correlation strengths, we aimed to identify the most relevant and informative variables that exhibited a strong link with SPAD values during different periods of data collection.
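The correlation-based filtering can be sketched as below; top_k_by_pearson and the toy columns are hypothetical, with one deliberately SPAD-linked feature included to show the ranking behavior:

```python
import numpy as np
import pandas as pd

def top_k_by_pearson(features, spad, k):
    """Rank feature columns by |Pearson r| against SPAD and keep the top k."""
    r = features.corrwith(spad)  # Pearson correlation per column
    order = r.abs().sort_values(ascending=False).index
    return r.reindex(order).head(k)

# Hypothetical stand-in data: four noise columns plus one feature that is
# deliberately linked to SPAD.
rng = np.random.default_rng(0)
spad = pd.Series(rng.uniform(60, 85, 200))
feats = pd.DataFrame(rng.random((200, 4)), columns=["a", "b", "c", "d"])
feats["vi_nir"] = spad * 0.01 + rng.normal(0.0, 0.02, 200)

top = top_k_by_pearson(feats, spad, k=2)
```

In the study, the same ranking is applied twice per period: once over the 285 vegetation indices (keeping 10) and once over the 40 texture features (keeping 5).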

2.4.5. Construction and Accuracy Verification of the Citrus Canopy SPAD Prediction Model

To investigate the impact of different feature variables, namely, single-index features, single-texture features, and atlas information fusion features, on the accuracy of citrus canopy SPAD prediction, this study developed four prediction models using machine learning algorithms. The models employed in this research include partial least squares regression (PLS), ridge regression (RR), random forest (RF), and support vector regression (SVR). These models were constructed to analyze and compare their performance in predicting the SPAD values of the citrus canopy. By integrating various feature variables, we aimed to evaluate the effectiveness of each model in accurately predicting SPAD values, providing valuable insights into the relationship among different feature types and the accuracy of SPAD predictions in citrus canopy analysis.
The PLS method combines the advantages of three analysis methods: principal component analysis, canonical correlation analysis, and multiple linear regression analysis. Partial least squares is often used for prediction in remote sensing and other disciplines because it remains efficient and reliable even when predictors are highly collinear or samples are limited [33].
Ridge regression is a biased estimation regression method that obtains more realistic and reliable regression coefficients at the cost of losing some information and reducing accuracy by adding L2 regularity terms [34].
The RF regression model uses decision trees as the base learner to construct bagging integration based on the introduction of random feature selection to construct decision trees for regression prediction [35]. In this paper, we set the number of trees (n_estimators) to 104, the maximum depth (max_depth) and minimum split samples (min_samples_split) to 3, and the minimum leaf node (min_samples_leaf) to 1.
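Under the assumption that the scikit-learn implementation was used, the reported RF configuration maps onto RandomForestRegressor as follows (the stand-in data are random, not the fused feature set):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# RF regressor configured with the hyperparameters reported in the paper.
rf = RandomForestRegressor(
    n_estimators=104,      # number of trees
    max_depth=3,           # maximum tree depth
    min_samples_split=3,   # minimum samples required to split a node
    min_samples_leaf=1,    # minimum samples required at a leaf node
    random_state=0,
)

# Hypothetical stand-in for the fused feature matrix and SPAD values.
rng = np.random.default_rng(0)
X = rng.random((210, 15))
y = X[:, 0] * 20 + 60 + rng.normal(0.0, 1.0, 210)
rf.fit(X, y)
pred = rf.predict(X)
```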
SVR is a machine learning method based on statistical learning theory [36]. Building on support vector machines, an ε-insensitive loss function is introduced to convert the classification framework into a regression task [37]. In this paper, the RBF kernel function was used to build the support vector regression model, with the penalty coefficient C and gamma tuned using learning curves; the larger the value of C, the smaller the tolerance for the slack variable ξ. The value of C was set to 111.1; gamma was set to 1.0; and the default value of the loss function parameter was retained.
In this study, grid search with cross-validation (grid search CV) was used for hyperparameter selection: the KFold function in the scikit-learn library was used to perform 5-fold cross-validation, and the cross_val_score function was used to score each model, improving model robustness.
The evaluation of model fitting performance involved the utilization of two indicators: the coefficient of determination (R2) and the root-mean-square error (RMSE). The calculation formulas for these indicators are provided in Equations (6) and (7). A higher value of R2 indicates a better fit of the model, reflecting the extent to which the observed variability is captured by the model. On the other hand, RMSE values closer to zero signify smaller errors between the predicted values and the corresponding measured values, indicating a higher level of prediction accuracy.
R² = Σᵢ₌₁ⁿ (ỹᵢ − ȳ)² / Σᵢ₌₁ⁿ (yᵢ − ȳ)²        (6)
RMSE = √( Σᵢ₌₁ⁿ (ỹᵢ − yᵢ)² / n )        (7)
where n is the number of samples (i = 1, 2, 3, ..., n); ỹᵢ and yᵢ represent the predicted and measured SPAD values, respectively; and ȳ is the average of the measured SPAD values.
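Implemented directly from Equations (6) and (7), the two indicators can be written as below; note that this R² form (regression sum of squares over total sum of squares) differs from the 1 − SSres/SStot definition used by scikit-learn's r2_score:

```python
import numpy as np

def r_squared(y_pred, y_true):
    """Coefficient of determination as written in Equation (6): regression
    sum of squares over total sum of squares about the measured mean."""
    y_bar = np.mean(y_true)
    return np.sum((y_pred - y_bar) ** 2) / np.sum((y_true - y_bar) ** 2)

def rmse(y_pred, y_true):
    """Root-mean-square error, Equation (7)."""
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

# Small worked example with made-up SPAD values.
y_true = np.array([70.0, 72.0, 75.0, 68.0])
y_pred = np.array([69.5, 72.5, 74.0, 69.0])
r2 = r_squared(y_pred, y_true)
err = rmse(y_pred, y_true)
```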

3. Results and Analysis

3.1. Spectral Response Analysis Considering Bordeaux Cover

By comparing the multispectral data collected at three different times, variations in spectral response were observed as the Bordeaux coverage naturally decreased. Figure 5 and Table 3 illustrate the spectral reflectance of the citrus canopy during the three periods after implementing multiple scattering correction. Figure 5a–c are taken from 100 samples in July, September, and November, respectively.
As depicted in Figure 5d and Table 3, the spectral reflectance of the canopy showed similar trends in July, September, and November. However, there were some differences in the values. When the Bordeaux mixture was most abundant in July, the spectral reflectance at wavelengths of 450 nm, 560 nm, 650 nm, 730 nm, and 840 nm exhibited significantly higher values than that in September and November. This observation suggests that the reflectance in all five bands increased under the influence of the white Bordeaux mixture. The near-infrared reflectance (840 nm) in July was 0.1217 and 0.1469 higher than that in September and November, respectively. The reflectance of red (650 nm) was the least variable. The reflectance of the canopy in September and November was more similar overall, except at 560 nm and 650 nm. Spectral reflectance of the 450 nm, 730 nm, and 840 nm bands was higher in September than in November, with the highest difference being located at 840 nm, followed by 450 nm, and the smallest difference was at 650 nm.
From the band analysis, we concluded that the spectral reflectance of the 450 nm, 730 nm, and 840 nm bands decreased with the decrease in Bordeaux cover. In July and September, the spectral reflectance at 450 nm was higher than that at 560 nm, while the opposite trend was observed in November, showing the spectral characteristics of green plants. These results demonstrate that the Bordeaux mixture impact on the blue band (450 nm) is more pronounced than that on the green band (560 nm). The range of spectral reflectance was found to be the smallest at 650 nm with a value of 0.0249, whereas it exhibited the largest variation at 840 nm with a magnitude of 0.1469. These results indicate that the Bordeaux mixture’s impact on the red band (650 nm) was minimal, while it exerted the most pronounced influence on the NIR band (840 nm). According to the above analysis, the wavebands influenced by Bordeaux coverage can be ranked in descending order as follows: near-infrared (840 nm), red-edge (730 nm), blue (450 nm), green (560 nm), and red (650 nm) bands.

3.2. Feature Correlations under Different Bordeaux Coverage Amounts

Table 4 presents the 10 vegetation indices and 5 texture features that were chosen using the correlation analysis of the entire sample at various levels of Bordeaux coverage. At the heavy-coverage degree, it was observed that all selected vegetation indices exhibited a positive correlation, with an average correlation coefficient of 0.3220. Additionally, all these vegetation indices were associated with the near-infrared (NIR) band. The selected texture features were exclusively related to red-light textures and demonstrated positive correlations except for homogeneity. The highest correlation coefficient with SPAD was found for variance, yielding an R value of 0.3266, while contrast exhibited the lowest correlation, with an R value of 0.2720. Under medium coverage, the selected vegetation indices displayed a positive correlation with the near-infrared (NIR) band, with an average correlation coefficient of 0.3532. In comparison to the heavy-coverage conditions, these correlations showed an average increase of 0.0312. The selected texture features primarily represented green textures, with the second moment exhibiting the highest correlation. At the light-coverage degree, the majority of the selected vegetation indices were associated with the red-edge band and demonstrated negative correlations, with an average absolute value of 0.5801. Similarly, all the selected texture features represented red-light textures and also exhibited negative correlations. These observations reveal that as the coverage of Bordeaux liquid decreases, there is a gradual increase in the correlation between the vegetation index and SPAD measurements, while there is a gradual decrease in the correlation between texture features and SPAD measurements.
By utilizing the complete dataset for feature extraction, the correlations between the vegetation index and SPAD measurements were further examined. The vegetation index that exhibited the highest correlation was (NIR-green)/edge, with an R value of 0.7526 and a mean R value of 0.7402. Notably, all vegetation indices demonstrated associations with NIR and red-edge spectra. The correlation between texture features and SPAD was also improved, and all involved NIR and red-edge spectra. Except for the mean texture of NIR, which displayed a positive correlation, all other texture features demonstrated a negative correlation. The texture feature with the highest correlation was the second moment of the NIR spectra, with a correlation value of −0.6602 and a mean absolute R value of 0.6423.

3.3. Accuracy Evaluation of the Texture and Spectral Feature Fusion Model

The vegetation index and texture features were fused and modeled to compare and analyze the differences in prediction accuracy for different coverage degrees, and their accuracy values are shown in Table 5. Among the different coverage degrees, the RF model exhibits the highest prediction accuracy for the heavy-coverage degree, with R of 0.533 and RMSE of 1.810 in the test set. This is followed by the SVR model with a slightly lower accuracy, having R of 0.490 and RMSE of 2.088 in the test set. For a medium-coverage degree, the SVR model demonstrates the highest accuracy, with R of 0.518 and RMSE of 2.583. Under the light-coverage level, the SVR model remains the most accurate model, yielding R of 0.718 and RMSE of 2.722 for the test set. Finally, for the entire dataset, the PLS model achieves the highest accuracy, with R of 0.804 and RMSE of 2.614 in the test set. Additionally, the heavy-, medium-, and light-coverage-level test sets have average R values of 0.457, 0.398, and 0.642, respectively, with corresponding RMSE values of 2.017, 2.454, and 2.873, respectively. In relation to model accuracy, there is a significant improvement in accuracy for each model as the Bordeaux mixture coverage decreases. While the accuracy of the SVR model is slightly lower than that of the RF model for heavy coverage, the SVR model has the highest accuracy under medium and light coverage. Conversely, although the PLS model has the highest accuracy when employing the complete dataset, it is found to be less effective across the three distinct coverage levels. As a result, the SVR model is identified as the optimal model.
The SVR model trained on the complete dataset was utilized to predict the SPAD values across the three distinct coverage levels, as depicted in Table 6. Comparing the accuracy of training on a single month of data, the accuracy improves under both heavy and medium coverage for full-sample training. The R value improved by 0.013, and RMSE decreased by 0.112 under heavy coverage; the R value improved by 0.011, and RMSE decreased by 0.305 under medium coverage, while the prediction accuracy decreased under light coverage.

3.4. Validation of Different Features on the Accuracy of Citrus Canopy SPAD Prediction

To compare and analyze the effects of different features on the accuracy of citrus canopy SPAD prediction, the vegetation indices (VIs), texture features (TFs), and fused features were input into four groups of machine learning models. The results are presented in Table 7.
In terms of feature categories, both the training and validation sets exhibit an increase in R2 when utilizing data from the VIs and fused TFs. Specifically, the average increase in R2 for the training set is 0.015, while for the validation set, it is 0.039. Notably, the PLS and SVR models display the highest increase, with a value of 0.054. The R2 value of all test sets with fused data modeling algorithms is greater than 0.6, with an average of 0.641 and an average RMSE value of 2.594, indicating that the VI and TF fusion models have better prediction ability. When considering the four groups of models, the average R2 value of the test set for TFs is 0.524, and the average RMSE for TFs across the models is 3.1277. These results suggest that TFs are less effective in prediction than other feature categories. From the independent validation results, the VI and TF fusion models exhibit better fitting and prediction accuracy than other single-source data models, indicating that the multisource data fusion method has some advantages in predicting citrus canopy SPAD.

3.5. Prediction of the SPAD Distribution in Citrus Canopies

By mapping the spatial distribution of SPAD in different time periods in the study area, it is possible to effectively depict the spatial variation in citrus canopy SPAD values. This approach is valuable for monitoring the health condition of citrus fruit trees. The fused features of the three periods were calculated and input into the SVR model to predict the SPAD values pixel by pixel, and the results are shown in Figure 6. Predictions of canopy SPAD are good in all three months, and various features, such as fruit trees, grass, roads, houses, and reservoirs, are effectively distinguished. In the study area, the average SPAD of citrus fruit trees is approximately 68–70 in July and November, with normal chlorophyll content and good growth. However, lower predicted values appear on the north and south sides of the map in November, which is due to cloud cover at the time the UAV images were collected. Among the three months, the SPAD value is the highest in September, with an average value of approximately 75, which is related to the fact that citrus fruit trees are in the strong fruiting and tending period in this month.

4. Discussion

Chlorophyll concentration is one of the most important nutritional indicators of crop growth [8]. In this study, SPAD values differed significantly among the three months (Table 2), with mean values ranking, in descending order, September (75.98), July (71.49), and November (68.49). This pattern likely reflects the growth cycle of citrus trees: September corresponds to the phase of vigorous fruiting and shoot tending, which demands substantial nutrients (such as nitrogen, phosphorus, and potassium) to sustain fruit development. This observation is consistent with the findings of Yue et al., who reported an increase in SPAD values accompanied by a decrease in nutrient concentrations, including nitrogen, phosphorus, and potassium [38].
With the development of aerospace technology, UAV remote sensing plays an important role in modern agriculture, enabling convenient short- and long-term monitoring of changes in crop physiological parameters [39]. By extracting multispectral data from UAV imagery, we found that Bordeaux mixture that dried onto citrus leaves strongly altered the canopy spectral response. As illustrated in Figure 5d, multispectral reflectance was significantly higher in July than in September and November. This disparity can be attributed to the larger amount of Bordeaux residue present at the commencement of the experiment; once dry, the copper sulfate mixture forms a white or off-white coating that raises canopy reflectance.
Previous studies have shown that the most sensitive spectral indices vary considerably with crop species, study area, and environmental stress [40], so inverse modelling based on classical spectral indices has limitations. Given the Bordeaux mixture's influence on the spectral response, conventional vegetation indices draw on a limited amount of information from specific bands (e.g., red edge, red, and green) and fail to capture all of the useful band information. Hence, this study constructed 265 combined vegetation indices based on Equations (1)–(5) and extracted 40 texture features, using the fused vegetation indices (VIs) and texture features (TFs) as data sources to predict the SPAD of the citrus canopy under Bordeaux mixture coverage. The R2 values of both the training and test sets improved, and the RMSE values decreased, across all four groups of models. This is consistent with the findings of Zhu et al., who employed multiple data sources to monitor wheat scab [12]. VI features primarily capture alterations in pigmentation and cellular structure in affected citrus, whereas TF components reflect the external color and surface morphology of the canopy [41]. The fused VI-TF models were more accurate than the single-data-source models, likely because VIs and TFs together reflect both the internal and external changes in citrus caused by the application of Bordeaux mixture. We therefore suggest giving more attention to combining texture and spectral information, rather than using spectral information alone, in agriculture-related applications.
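The combinatorial construction of the index pool can be sketched as string templates over the five bands. This is an illustration, not the authors' code: the full cross-product of the five families below yields 270 expressions, and the 265 retained in Table A2 appear to omit a handful that coincide with classical indices in Table A1 (e.g., nir/red is RVI and nir − red is DVI).

```python
from itertools import permutations, combinations

bands = ["blue", "green", "red", "red_edge", "nir"]

# Five index families, mirroring the patterns listed in Table A2
ratios = [f"{a}/{b}" for a, b in permutations(bands, 2)]                     # band ratios
diffs = [f"({a} - {b})" for b, a in combinations(bands, 2)]                  # longer minus shorter wavelength
diff_over_band = [f"{d}/{b}" for d in diffs for b in bands]                  # difference over a band
diff_over_diff = [f"{d1}/{d2}" for d1 in diffs for d2 in diffs if d1 != d2]  # difference over difference
sums = [f"({a} + {b})" for a, b in combinations(bands, 2)]
diff_over_sum = [f"{d}/{s}" for d in diffs for s in sums]                    # normalized-difference forms

all_indices = ratios + diffs + diff_over_band + diff_over_diff + diff_over_sum
print(len(all_indices))  # → 270 before removing overlaps with the classical indices
```

Each string can then be evaluated per pixel against the five reflectance layers to populate the high-dimensional feature dataset.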
Classical linear regression models have the benefits of simplicity and ease of implementation; however, they face challenges when striving for higher levels of accuracy in statistical regression [9]. For instance, multiple stepwise regression (MSR) models are influenced by several factors, including crop type, fertility period, and climatic conditions, thus limiting their applicability across a broad spectrum of applications.
The integration of machine learning algorithms with remote sensing data has gained significant traction and demonstrated considerable success in plant remote sensing monitoring [42]. Support vector regression (SVR), a widely used machine learning model, has several advantages: its support vectors and decision boundary give it robust noise resistance and high accuracy, facilitating superior data fitting. In this study, citrus canopy SPAD prediction models were developed based on PLS, RR, RF, and SVR, and the SVR model demonstrated superior performance. This outcome is consistent with the findings of Jing et al. and Wang et al., who employed support vector machine models to screen wheat stripe rust traits and to select wheat varieties, respectively [43,44].
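SVR's noise resistance comes from its ε-insensitive loss: training points whose residuals fall inside the ε-tube contribute nothing to the loss and are not support vectors, so moderate noise leaves the fitted function largely unchanged. A minimal sketch on synthetic data (not the study's measurements):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y_clean = np.sin(2 * np.pi * x).ravel()
y_noisy = y_clean + rng.normal(scale=0.1, size=200)  # measurement noise

# epsilon=0.1 tells the model to ignore residuals at or below the noise level
svr = SVR(kernel="rbf", C=10, epsilon=0.1).fit(x, y_noisy)

mean_abs_err = np.abs(svr.predict(x) - y_clean).mean()  # error vs. the noiseless signal
n_support = len(svr.support_)  # only points outside the tube become support vectors
```

Despite the noise, the recovered curve stays close to the underlying signal, and well under all 200 points are retained as support vectors.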
However, there is some room for improvement in detecting citrus canopy SPAD values under Bordeaux mixture coverage because of the limited available spectral and texture information from visible sensors. Finally, the results of this study suggest a strategy for the UAV monitoring of the seasonal SPAD differences of fruit trees, especially combining spectral and texture information for the SPAD monitoring of citrus canopies under Bordeaux mixture coverage.

5. Conclusions

This research focused on the prediction of citrus canopy SPAD in the presence of Bordeaux mixture coverage. To achieve this, remote sensing images of the citrus canopy were acquired during three periods using UAV multispectral imaging technology. By fusing information, a predictive model for citrus canopy SPAD under Bordeaux mixture coverage was developed. The following conclusions were derived from this study:
(1)
The influence of Bordeaux coverage on citrus canopy reflectance, ranked from largest to smallest by band, is near-infrared (840 nm), red-edge (730 nm), blue (450 nm), green (560 nm), and red (650 nm).
(2)
The fused data method has better fitting and prediction accuracy than the models based on a single data source. The average R2 of the test set was 0.641, and the average RMSE was 2.594, both superior to the single-TF and -VI models. The multisource data fusion method is more suitable for UAV remote sensing to monitor the SPAD value of citrus canopies covered with Bordeaux mixture.
(3)
The optimal SPAD prediction model of citrus canopies covered with Bordeaux mixture is the SVR model fused with VIs and TFs, with a test-set R2 value of 0.658 and an RMSE value of 2.752.
(4)
Compared with the models trained on single-coverage samples, the accuracy of the optimal full-sample SVR-based SPAD prediction under both heavy and moderate coverage was improved, with R increasing by 0.013 and 0.011 and RMSE decreasing by 0.112 and 0.305, respectively.

Author Contributions

Conceptualization, S.D. (Shunshun Ding), J.J. and S.D. (Shiqing Dou); Data curation, M.Z., W.Z. and S.D. (Shunshun Ding); Formal analysis, J.J. and S.D. (Shunshun Ding); Funding acquisition, J.J. and S.D. (Shiqing Dou); Investigation, M.Z., W.Z., S.D. (Shunshun Ding) and S.D. (Shiqing Dou); Methodology, S.D. (Shunshun Ding), W.Z. and S.D. (Shiqing Dou); Project administration, S.D. (Shiqing Dou); Resources, S.D. (Shunshun Ding) and J.J.; Software, S.D. (Shunshun Ding), W.Z. and M.Z.; Validation, S.D. (Shunshun Ding) and S.D. (Shiqing Dou); Visualization, S.D. (Shunshun Ding) and J.J.; Writing—original draft, S.D. (Shunshun Ding), W.Z. and M.Z.; Writing—review and editing, J.J. and S.D. (Shiqing Dou). All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (No. 42161028) and the Guilin Science and Technology Bureau Development Program (No. 2020010701; No. 20210226-2).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data provided in this study can be made available upon request from the corresponding author. The data have not been made public because they are still being used for further research.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Twenty classical vegetation indices.
Vegetation Index | Formula | Source
NDVI | (nir − r)/(nir + r) | [15]
RVI | nir/r | [16]
RDVI | (nir − r)/√(nir + r) | [17]
EVI | 2.5(nir − r)/(nir + 6r − 7.5b + 1) | [18]
DVI | nir − r | [19]
SAVI | 1.5(nir − r)/(nir + r + 0.5) | [20]
OSAVI | (nir − r)/(nir + r + 0.16) | [21]
MSAVI | 0.5[2nir + 1 − √((2nir + 1)² − 8(nir − r))] | [22]
NLI | (nir² − r)/(nir² + r) | [23]
MNLI | 1.5(nir² − r)/(nir² + r + 0.5) | [23]
TSAVI | 0.3(nir − 0.3r − 0.5)/(0.3nir + r − 0.15 + 1.5(1 + 0.3²)) | [24]
CARI | (edge/r) × (a·r + r + g − a·g)/√(a² + 1), with a = (edge − g)/150 | [25]
TVI | 0.5[120(nir − g) − 200(r − g)] | [26]
MSR | (nir/r − 1)/(√(nir/r) + 1) | [27]
GNDVI | (nir − g)/(nir + g) | [28]
ARVI | (nir − 2r + b)/(nir + 2r − b) | [29]
RECI | nir/r − 1 | [30]
VARI | (g − r)/(g + r − b) | [30]
NDRE | (nir − edge)/(nir + edge) | [31]
SIPI | (nir − b)/(nir − r) | [32]
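As a worked check, a few of the classical indices in Table A1 can be evaluated directly from the mean July canopy reflectances reported in Table 3 (a sketch; the band values are the table's monthly averages, not per-pixel data):

```python
# Mean July canopy reflectances from Table 3
b, g, r, edge, nir = 0.0889, 0.0827, 0.0557, 0.2314, 0.3797

ndvi = (nir - r) / (nir + r)            # NDVI
ndre = (nir - edge) / (nir + edge)      # NDRE
osavi = (nir - r) / (nir + r + 0.16)    # OSAVI
vari = (g - r) / (g + r - b)            # VARI

print(round(ndvi, 3), round(ndre, 3), round(osavi, 3), round(vari, 3))
# → 0.744 0.243 0.544 0.545
```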

Appendix B

Table A2. The 265 vegetation indices.
blue/green
blue/red
blue/red_edge
blue/nir
green/blue
green/red
green/red_edge
green/nir
red/blue
red/green
red/red_edge
red/nir
red_edge/blue
red_edge/green
red_edge/red
red_edge/nir
nir/blue
nir/green
nir/red_edge
nir − red_edge
nir − green
nir − blue
red_edge − red
red_edge − green
red_edge − blue
red − green
red − blue
green − blue
(nir − red_edge)/blue
(nir − red_edge)/green
(nir − red_edge)/red
(nir − red_edge)/red_edge
(nir − red_edge)/nir
(nir − red)/blue
(nir − red)/green
(nir − red)/red
(nir − red)/red_edge
(nir − red)/nir
(nir − green)/blue
(nir − green)/green
(nir − green)/red
(nir − green)/red_edge
(nir − green)/nir
(nir − blue)/blue
(nir − blue)/green
(nir − blue)/red
(nir − blue)/red_edge
(nir − blue)/nir
(red_edge − red)/blue
(red_edge − red)/green
(red_edge − red)/red
(red_edge − red)/red_edge
(red_edge − red)/nir
(red_edge − green)/blue
(red_edge − green)/green
(red_edge − green)/red
(red_edge − green)/red_edge
(red_edge − green)/nir
(red_edge − blue)/blue
(red_edge − blue)/green
(red_edge − blue)/red
(red_edge − blue)/red_edge
(red_edge − blue)/nir
(red − green)/blue
(red − green)/green
(red − green)/red
(red − green)/red_edge
(red − green)/nir
(red − blue)/blue
(red − blue)/green
(red − blue)/red
(red − blue)/red_edge
(red − blue)/nir
(green − blue)/blue
(green − blue)/green
(green − blue)/red
(green − blue)/red_edge
(green − blue)/nir
(nir − red_edge)/(nir − red)
(nir − red_edge)/(nir − green)
(nir − red_edge)/(nir − blue)
(nir − red_edge)/(red_edge − red)
(nir − red_edge)/(red_edge − green)
(nir − red_edge)/(red_edge − blue)
(nir − red_edge)/(red − green)
(nir − red_edge)/(red − blue)
(nir − red_edge)/(green − blue)
(nir − red)/(nir − red_edge)
(nir − red)/(nir − green)
(nir − red)/(nir − blue)
(nir − red)/(red_edge − red)
(nir − red)/(red_edge − green)
(nir − red)/(red_edge − blue)
(nir − red)/(red − green)
(nir − red)/(red − blue)
(nir − red)/(green − blue)
(nir − green)/(nir − red_edge)
(nir − green)/(nir − red)
(nir − green)/(nir − blue)
(nir − green)/(red_edge − red)
(nir − green)/(red_edge − green)
(nir − green)/(red_edge − blue)
(nir − green)/(red − green)
(nir − green)/(red − blue)
(nir − green)/(green − blue)
(nir − blue)/(nir − red_edge)
(nir − blue)/(nir − green)
(nir − blue)/(red_edge − red)
(nir − blue)/(red_edge − green)
(nir − blue)/(red_edge − blue)
(nir − blue)/(red − green)
(nir − blue)/(red − blue)
(nir − blue)/(green − blue)
(red_edge − red)/(nir − red_edge)
(red_edge − red)/(nir − red)
(red_edge − red)/(nir − green)
(red_edge − red)/(nir − blue)
(red_edge − red)/(red_edge − green)
(red_edge − red)/(red_edge − blue)
(red_edge − red)/(red − green)
(red_edge − red)/(red − blue)
(red_edge − red)/(green − blue)
(red_edge − green)/(nir − red_edge)
(red_edge − green)/(nir − red)
(red_edge − green)/(nir − green)
(red_edge − green)/(nir − blue)
(red_edge − green)/(red_edge − red)
(red_edge − green)/(red_edge − blue)
(red_edge − green)/(red − green)
(red_edge − green)/(red − blue)
(red_edge − green)/(green − blue)
(red_edge − blue)/(nir − red_edge)
(red_edge − blue)/(nir − red)
(red_edge − blue)/(nir − green)
(red_edge − blue)/(nir − blue)
(red_edge − blue)/(red_edge − red)
(red_edge − blue)/(red_edge − green)
(red_edge − blue)/(red − green)
(red_edge − blue)/(red − blue)
(red_edge − blue)/(green − blue)
(red − green)/(nir − red_edge)
(red − green)/(nir − red)
(red − green)/(nir − green)
(red − green)/(nir − blue)
(red − green)/(red_edge − red)
(red − green)/(red_edge − green)
(red − green)/(red_edge − blue)
(red − green)/(red − blue)
(red − green)/(green − blue)
(red − blue)/(nir − red_edge)
(red − blue)/(nir − red)
(red − blue)/(nir − green)
(red − blue)/(nir − blue)
(red − blue)/(red_edge − red)
(red − blue)/(red_edge − green)
(red − blue)/(red_edge − blue)
(red − blue)/(red − green)
(red − blue)/(green − blue)
(green − blue)/(nir − red_edge)
(green − blue)/(nir − red)
(green − blue)/(nir − green)
(green − blue)/(nir − blue)
(green − blue)/(red_edge − red)
(green − blue)/(red_edge − green)
(green − blue)/(red_edge − blue)
(green − blue)/(red − green)
(green − blue)/(red − blue)
(nir − red_edge)/(blue + green)
(nir − red_edge)/(blue + red)
(nir − red_edge)/(blue + red_edge)
(nir − red_edge)/(blue + nir)
(nir − red_edge)/(green + red)
(nir − red_edge)/(green + red_edge)
(nir − red_edge)/(green + nir)
(nir − red_edge)/(red + red_edge)
(nir − red_edge)/(red + nir)
(nir − red_edge)/(red_edge + nir)
(nir − red)/(blue + green)
(nir − red)/(blue + red)
(nir − red)/(blue + red_edge)
(nir − red)/(blue + nir)
(nir − red)/(green + red)
(nir − red)/(green + red_edge)
(nir − red)/(green + nir)
(nir − red)/(red + red_edge)
(nir − red)/(red_edge + nir)
(nir − green)/(blue + green)
(nir − green)/(blue + red)
(nir − green)/(blue + red_edge)
(nir − green)/(blue + nir)
(nir − green)/(green + red)
(nir − green)/(green + red_edge)
(nir − green)/(red + red_edge)
(nir − green)/(red + nir)
(nir − green)/(red_edge + nir)
(nir − blue)/(blue + green)
(nir − blue)/(blue + red)
(nir − blue)/(blue + red_edge)
(nir − blue)/(blue + nir)
(nir − blue)/(green + red)
(nir − blue)/(green + red_edge)
(nir − blue)/(green + nir)
(nir − blue)/(red + red_edge)
(nir − blue)/(red + nir)
(nir − blue)/(red_edge + nir)
(red_edge − red)/(blue + green)
(red_edge − red)/(blue + red)
(red_edge − red)/(blue + red_edge)
(red_edge − red)/(blue + nir)
(red_edge − red)/(green + red)
(red_edge − red)/(green + red_edge)
(red_edge − red)/(green + nir)
(red_edge − red)/(red + red_edge)
(red_edge − red)/(red + nir)
(red_edge − red)/(red_edge + nir)
(red_edge − green)/(blue + green)
(red_edge − green)/(blue + red)
(red_edge − green)/(blue + red_edge)
(red_edge − green)/(blue + nir)
(red_edge − green)/(green + red)
(red_edge − green)/(green + red_edge)
(red_edge − green)/(green + nir)
(red_edge − green)/(red + red_edge)
(red_edge − green)/(red + nir)
(red_edge − green)/(red_edge + nir)
(red_edge − blue)/(blue + green)
(red_edge − blue)/(blue + red)
(red_edge − blue)/(blue + red_edge)
(red_edge − blue)/(blue + nir)
(red_edge − blue)/(green + red)
(red_edge − blue)/(green + red_edge)
(red_edge − blue)/(green + nir)
(red_edge − blue)/(red + red_edge)
(red_edge − blue)/(red + nir)
(red_edge − blue)/(red_edge + nir)
(red − green)/(blue + green)
(red − green)/(blue + red)
(red − green)/(blue + red_edge)
(red − green)/(blue + nir)
(red − green)/(green + red)
(red − green)/(green + red_edge)
(red − green)/(green + nir)
(red − green)/(red + red_edge)
(red − green)/(red + nir)
(red − green)/(red_edge + nir)
(red − blue)/(blue + green)
(red − blue)/(blue + red)
(red − blue)/(blue + red_edge)
(red − blue)/(blue + nir)
(red − blue)/(green + red)
(red − blue)/(green + red_edge)
(red − blue)/(green + nir)
(red − blue)/(red + red_edge)
(red − blue)/(red + nir)
(red − blue)/(red_edge + nir)
(green − blue)/(blue + green)
(green − blue)/(blue + red)
(green − blue)/(blue + red_edge)
(green − blue)/(blue + nir)
(green − blue)/(green + red)
(green − blue)/(green + red_edge)
(green − blue)/(green + nir)
(green − blue)/(red + red_edge)
(green − blue)/(red + nir)
(green − blue)/(red_edge + nir)

References

  1. Wu, Y.; Chen, Y.; Liao, X.; Liao, Y.; Gao, C.; Guan, H.; Yu, C. Study on the Identification Method of Citrus Leaves Based on Hyperspectral Imaging Technique. Spectrosc. Spectr. Anal. 2021, 41, 3837–3843. [Google Scholar]
  2. Yue, X.; Ling, K.; Hong, T.; Gan, H.; Liu, Y.; Wang, L. Distribution Model of Chlorophyll Content for Longan Leaves Based on Hyperspectral Imaging Technology. Trans. Chin. Soc. Agric. Mach. 2018, 49, 18–25. [Google Scholar]
  3. Lu, F.; Hu, P.; Lin, M.; Ye, X.; Chen, L.; Huang, Z. Photosynthetic Characteristics and Chloroplast Ultrastructure Responses of Citrus Leaves to Copper Toxicity Induced by Bordeaux Mixture in Greenhouse. Int. J. Mol. Sci. 2022, 23, 9835. [Google Scholar] [CrossRef] [PubMed]
  4. Ye, W.; Yan, T.; Zhang, C.; Duan, L.; Chen, W.; Song, H.; Zhang, Y.; Xu, W.; Gao, P. Detection of Pesticide Residue Level in Grape Using Hyperspectral Imaging with Machine Learning. Foods 2022, 11, 1609. [Google Scholar] [CrossRef]
  5. Ren, Z.Q.; Rao, Z.H.; Ji, H.Y. Identification of Different Concentrations Pesticide Residues of Dimethoate on Spinach Leaves by Hyperspectral Image Technology. IFAC-PapersOnLine 2018, 51, 758–763. [Google Scholar]
  6. Tian, J.; Yang, Z.; Feng, K.; Ding, X. Prediction of Tomato Canopy SPAD Based on UAV Multispectral Image. Trans. Chin. Soc. Agric. Mach. 2020, 51, 178–188. [Google Scholar]
  7. Liu, S.; Zhang, B.; Yang, W.; Chen, T.; Zhang, H.; Lin, Y.; Tan, J.; Li, X.; Gao, Y.; Yao, S.; et al. Quantification of Physiological Parameters of Rice Varieties Based on Multi-Spectral Remote Sensing and Machine Learning Models. Remote Sens. 2023, 15, 453. [Google Scholar] [CrossRef]
  8. Wang, D.; Zhao, P.; Sun, J.; Niu, L.; Liu, B. Inversion of Chlorophyll Content in Summer Maize Based on UAV Multi-Spectrum. Shandong Agric. Sci. 2021, 53, 121–126+132. [Google Scholar]
  9. Fu, B.; Sun, J.; Li, Y.; Zuo, P.; Deng, T.; He, H.; Fan, D.; Gao, E. Mangrove LAI estimation based on remote sensing images and machine learning algorithms. Trans. Chin. Soc. Agric. Eng. 2022, 38, 218–228. [Google Scholar]
  10. Niu, Q.; Feng, H.; Zhou, X.; Zhu, J.; Yong, B.; Li, H. Combining UAV Visible Light and Multispectral Vegetation Indices for Estimating SPAD Value of Winter Wheat. Trans. Chin. Soc. Agric. Mach. 2021, 52, 183–194. [Google Scholar]
  11. Gan, H. Research on the Distribution of Chlorophyll Concentration of Longan Leaves Based on Hyperspectrum; College of Engineering, South China Agricultural University: Guangzhou, China, 2018. [Google Scholar]
  12. Zhu, W.; Feng, Z.; Dai, S.; Zhang, P.; Wei, X. Using UAV Multispectral Remote Sensing with Appropriate Spatial Resolution and Machine Learning to Monitor Wheat Scab. Agriculture 2022, 12, 1785. [Google Scholar] [CrossRef]
  13. Guo, Y.H.; Fu, S.; Chen, C.R.; Bryant, X.; Li, J.; Senthilnath, H.; Sun, S.; Wang, Z.; Wu, K.; De Beurs, K. Integrating spectral and textural information for identifying the tasseling date of summer maize using UAV based RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102435. [Google Scholar] [CrossRef]
  14. Xu, Y.; Zhen, J.; Jiang, X.; Wang, J. Mangrove species classification with UAV-based remote sensing data and XGBoost. Natl. Remote Sens. Bull. 2021, 25, 737–752. [Google Scholar] [CrossRef]
  15. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  16. Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
  17. Roujean, J.L.; Breon, F.M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
  18. Huete, A.R.; Liu, H.Q.; Batchily, K.; Van Leeuwen, W. A comparison of vegetation indices over a global set of TM images for EOS-MODIS. Remote Sens. Environ. 1997, 59, 440–451. [Google Scholar] [CrossRef]
  19. Clevers, J. The derivation of a simplified reflectance model for the estimation of leaf area index. Remote Sens. Environ. 1988, 25, 53–69. [Google Scholar] [CrossRef]
  20. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  21. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  22. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  23. Gong, P.; Pu, R.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from Hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef]
  24. Baret, F.; Guyot, G. Potentials and limits of vegetation indices for LAI and APAR assessment. Remote Sens. Environ. 1991, 35, 161–173. [Google Scholar] [CrossRef]
  25. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  26. Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
  27. Chen, J.M. Evaluation of vegetation indices and a modified simple ratio for boreal applications. Can. J. Remote Sens. 1996, 22, 229–242. [Google Scholar] [CrossRef]
  28. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  29. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  30. Lichtenthaler, H.K.; Gitelson, A.; Lang, M. Non-destructive determination of chlorophyll content of leaves of a green and an aurea mutant of tobacco by reflectance measurements. J. Plant Physiol. 1996, 148, 483–493. [Google Scholar] [CrossRef]
  31. Sturtevant, C.S.; Oechel, W.C. Spatial variation in landscape-level CO2 and CH4 fluxes from arctic coastal tundra: Influence from vegetation, wetness, and the thaw lake cycle. Glob. Change Biol. 2013, 19, 2853–2866. [Google Scholar] [CrossRef]
  32. Blaga, L.; Josan, I.; Vasile-Herman, G.; Grama, V.; Nistor, S.; Suba, N.-S. Assessment of the Forest Health Through Remote Sensing Techniques in Valea Roșie Natura 2000 Site, Bihor County, Romania. J. Appl. Eng. Sci. 2019, 9, 207–215. [Google Scholar] [CrossRef]
  33. Feng, Z.; Song, L.; Zhang, S.; Jing, Y.; He, L.; Li, F.; Feng, W. Wheat Powdery Mildew monitoring based on information fusion of multi-spectral and thermal infrared images acquired with an unmanned aerial vehicle. Sci. Agric. Sin. 2022, 55, 890–906. [Google Scholar]
  34. Firinguetti, L.; Golam Kibria, B.M.; Araya, R. Study of partial least squares and ridge regression methods. Commun. Stat. Simul. Comput. 2017, 46, 6631–6644. [Google Scholar] [CrossRef]
  35. Zhang, T.; Long, L.; Wang, K.; Tang, H.; Yang, X.; Duan, Y.; Li, H. A novel approach for the quantitative analysis of multiple elements in steel based on laser-induced breakdown spectroscopy (LIBS) and random forest regression (RFR). J. Anal. At. Spectrom. 2014, 29, 2323–2329. [Google Scholar] [CrossRef]
  36. Tao, H.; Feng, H.; Yang, G.; Yang, X.; Liu, M.; Liu, S. Leaf area index estimation of winter wheat based on UAV imaging hyperspectral imagery. Trans. Chin. Soc. Agric. Mach. 2020, 51, 176–187. [Google Scholar]
  37. Wang, P.; Qian, H.; Zhang, R.; Han, D.; Wang, J.; Yi, M. Crop Growth monitoring and yield estimation based on deep learning: State of the art and beyond. Trans. Chin. Soc. Agric. Mach. 2022, 53, 1–14. [Google Scholar]
  38. Yue, X.; Ling, K.; Wang, L.; Cui, Z.; Lu, Y.; Liu, Y. Inversion of Potassium Content for Citrus Leaves Based on Hyperspectral and Deep Transfer Learning. Trans. Chin. Soc. Agric. Mach. 2019, 50, 186–195. [Google Scholar]
  39. Liu, L.; Dong, Y.; Huang, W.; Du, X.; Luo, J.; Huang, W.; Ma, H. Enhanced regional monitoring of wheat powdery mildew based on an instance-based transfer learning method. Remote Sens. 2019, 11, 298. [Google Scholar] [CrossRef]
  40. Liu, S.; Yu, H.; Zhang, J.; Zhou, H.; Kong, L.; Zhang, L.; Dang, J.; Sui, Y. Study on Inversion Model of Chlorophyll Content in Soybean Leaf Based on Optimal Spectral Indices. Spectrosc. Spectr. Anal. 2021, 41, 1912–1919. [Google Scholar]
  41. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat yellow rust detection using UAV-based hyperspectral technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  42. Ma, B.; Cao, G.; Hu, C.; Chen, C. Monitoring the Rice Panicle Blast Control Period Based on UAV Multispectral Remote Sensing and Machine Learning. Land 2023, 12, 469. [Google Scholar] [CrossRef]
  43. Jing, X.; Zhang, T.; Bai, Z.; Huang, W. Feature selection and model construction of wheat stripe rust based on GA and SVR algorithm. Trans. Chin. Soc. Agric. Mach. 2020, 51, 253–263. [Google Scholar]
  44. Wang, J.; Zhou, Q.; Shang, J.; Liu, C.; Zhuang, T.; Ding, J.; Xian, Y.; Zhao, L.; Wang, W.; Zhou, G.; et al. UAV- and machine learning-based retrieval of wheat SPAD values at the overwintering stage for variety screening. Remote Sens. 2021, 13, 5166. [Google Scholar] [CrossRef]
Figure 1. Overview of the study area.
Figure 2. True-color images of the study area in July, September, and November. (a) True-color image of the study area in July; (b) true-color image of the study area in September; (c) true-color image of the study area in November.
Figure 3. Instruments for data acquisition. (a) Unmanned aerial vehicle; (b) calibration plates; (c) SPAD-502Plus chlorophyll meter.
Figure 4. Texture characteristics of the 3-phase multispectral images. (a) July texture features; (b) September texture features; (c) November texture features.
Figure 5. Spectral reflectance of the citrus canopy in different periods. (a) Spectral reflectance in July; (b) spectral reflectance in September; (c) spectral reflectance in November; (d) average reflectance for the three periods.
Figure 6. Spatial prediction of SPAD values of the citrus canopy in different periods in the study area. (a) July SPAD prediction; (b) September SPAD prediction; (c) November SPAD prediction.
Table 1. Center wavelength and radiation correction coefficient of the UAV multispectral camera.
Multispectral Band | Center Wavelength (Bandwidth)/nm | Radiation Correction Coefficient
Blue | 475 ± 20 | 0.537
Green | 560 ± 20 | 0.538
Red | 668 ± 10 | 0.537
Red edge | 717 ± 10 | 0.533
NIR | 840 ± 40 | 0.536
Table 2. Statistics of SPAD values for citrus samples.
Month | Sample Number | Average SPAD | Standard Deviation | Train Value Range | Train Value std | Test Value Range | Test Value std
July | 100 | 71.49 | 2.28 | [64.65, 75.75] | 2.29 | [66.08, 75.99] | 2.26
September | 100 | 75.98 | 2.68 | [67.17, 83.05] | 2.78 | [69.76, 80.31] | 2.42
November | 100 | 68.49 | 3.57 | [58.08, 74.47] | 3.39 | [58.54, 74.43] | 3.91
Whole period | 300 | 71.99 | 4.22 | [58.08, 83.05] | 4.15 | [58.54, 80.31] | 4.39
Table 3. Average spectral reflectance of the citrus canopy in the 3 periods after implementing MSC.
Month | Blue Reflectance | Green Reflectance | Red Reflectance | Red-Edge Reflectance | NIR Reflectance
July | 0.0889 | 0.0827 | 0.0557 | 0.2314 | 0.3797
September | 0.0417 | 0.0406 | 0.0273 | 0.1555 | 0.2580
November | 0.0280 | 0.0478 | 0.0308 | 0.1546 | 0.2328
Table 4. Characteristics of different Bordeaux coverage samples and their correlations.
Samples | VIs | R | TFs | R
Severe samples:
(NIR − edge)/(blue + NIR) | 0.3364 | red_variance | 0.3266
(NIR − edge)/(blue + edge) | 0.3357 | |
(NIR − edge)/(NIR − green) | 0.3224 | red_entropy | 0.3153
(NIR − green)/(edge − green) | 0.3206 | |
(NIR − edge)/(edge − green) | 0.3206 | red_dissimilarity | 0.2931
(NIR − edge)/(NIR − red) | 0.3192 | |
(NIR − edge)/(blue + green) | 0.3179 | red_homogeneity | −0.277
(NIR − red)/(edge − red) | 0.3162 | |
(NIR − edge)/(edge − red) | 0.3162 | red_contrast | 0.2720
NIR | 0.3144 | |
Middle samples:
NIR − edge | 0.3581 | blue_second moment | −0.2569
(NIR − edge)/(red + NIR) | 0.3552 | |
(NIR − edge)/(red + edge) | 0.3542 | edge_mean | −0.1773
(NIR − edge)/(NIR − blue) | 0.3528 | |
(NIR − edge)/NIR | 0.3524 | blue_variance | 0.1654
(NIR − edge)/(edge − blue) | 0.3520 | |
(NIR − blue)/(edge − blue) | 0.3520 | blue_entropy | 0.1591
(NIR − edge)/(green + NIR) | 0.3520 | |
(NIR − edge)/(edge + NIR) | 0.3515 | blue_homogeneity | −0.1590
TSAVI | 0.3514 | |
Light samples:
(edge − green)/(blue + red) | −0.5870 | red_entropy | −0.1700
(edge − red)/(blue + green) | −0.5869 | |
(NIR − red)/(blue + red) | 0.5853 | red_contrast | −0.1619
(NIR − blue)/(red + red) | 0.5848 | |
(edge − blue)/(green + red) | −0.5799 | red_dissimilarity | −0.1574
(edge − green)/(blue + NIR) | −0.5785 | |
(edge − green)/(red + NIR) | −0.5768 | red_variance | −0.1515
(edge − green)/NIR | −0.5752 | |
(edge − blue)/(red + NIR) | −0.5750 | red_mean | −0.1503
(edge − blue)/(NIR − blue) | −0.5713 | |
All samples:
(NIR − green)/edge | 0.7526 | NIR_second moment | −0.6602
(NIR − edge)/green | 0.7495 | |
(NIR − edge)/(green + red) | 0.7449 | edge_second moment | −0.6552
(NIR − green)/(red + edge) | 0.7401 | |
(NIR − red)/(green + edge) | 0.7389 | NIR_homogeneity | −0.6460
(NIR − edge)/(blue + edge) | 0.7380 | |
(NIR − edge)/(blue + NIR) | 0.7372 | edge_homogeneity | −0.6323
(NIR − red)/edge | 0.7352 | |
(NIR − edge)/(green + edge) | 0.7332 | NIR_mean | 0.6176
(NIR − green)/(edge + NIR) | 0.7327 | |
Table 5. Prediction accuracy of fused features for different coverage levels.
Model | Accuracy Index | Severe (Training/Test) | Middle (Training/Test) | Light (Training/Test) | All Samples (Training/Test)
PLS | R | 0.396/0.392 | 0.282/0.301 | 0.552/0.692 | 0.766/0.804
PLS | RMSE | 2.107/2.083 | 2.675/2.427 | 2.833/2.826 | 2.665/2.614
RR | R | 0.376/0.413 | 0.386/0.361 | 0.684/0.545 | 0.778/0.779
RR | RMSE | 2.101/2.089 | 2.233/2.600 | 2.854/2.850 | 2.355/2.743
RF | R | 0.615/0.533 | 0.425/0.412 | 0.619/0.612 | 0.806/0.770
RF | RMSE | 1.918/1.810 | 2.525/2.206 | 2.668/3.095 | 2.602/2.647
SVR | R | 0.540/0.490 | 0.599/0.518 | 0.606/0.718 | 0.823/0.781
SVR | RMSE | 1.931/2.088 | 2.233/2.583 | 2.701/2.722 | 2.607/2.752
Table 6. Prediction accuracy of the SVR model with full-sample training.
Model | Accuracy Index | Severe | Middle | Light
SVR | R | 0.503 | 0.529 | 0.458
SVR | RMSE | 1.976 | 2.278 | 3.175
Table 7. Prediction accuracy of different features for citrus canopy SPAD prediction.
Feature Category | Number of Variables | Model | Training R2 | Training RMSE | Test R2 | Test RMSE
VIs | 10 | PLS | 0.557 | 2.763 | 0.592 | 2.805
VIs | 10 | RR | 0.602 | 2.772 | 0.560 | 2.753
VIs | 10 | RF | 0.609 | 2.456 | 0.632 | 2.844
VIs | 10 | SVR | 0.545 | 2.652 | 0.604 | 3.032
TFs | 5 | PLS | 0.461 | 3.047 | 0.538 | 2.985
TFs | 5 | RR | 0.538 | 2.987 | 0.464 | 3.038
TFs | 5 | RF | 0.557 | 2.616 | 0.527 | 3.315
TFs | 5 | SVR | 0.492 | 2.800 | 0.567 | 3.173
VIs and TFs | 15 | PLS | 0.587 | 2.665 | 0.646 | 2.614
VIs and TFs | 15 | RR | 0.565 | 2.355 | 0.610 | 2.743
VIs and TFs | 15 | RF | 0.593 | 2.602 | 0.649 | 2.647
VIs and TFs | 15 | SVR | 0.629 | 2.722 | 0.658 | 2.752
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Ding, S.; Jing, J.; Dou, S.; Zhai, M.; Zhang, W. Citrus Canopy SPAD Prediction under Bordeaux Solution Coverage Based on Texture- and Spectral-Information Fusion. Agriculture 2023, 13, 1701. https://doi.org/10.3390/agriculture13091701
