Article

Estimation of Cotton SPAD Based on Multi-Source Feature Fusion and Voting Regression Ensemble Learning in Intercropping Pattern of Cotton and Soybean

by Xiaoli Wang 1,2, Jingqian Li 1,2, Junqiang Zhang 3, Lei Yang 1,2, Wenhao Cui 1,2, Xiaowei Han 4, Dulin Qin 5, Guotao Han 1,2, Qi Zhou 1,2, Zesheng Wang 2,6, Jing Zhao 1,2,* and Yubin Lan 7,*

1 School of Agricultural Engineering and Food Science, Shandong University of Technology, Zibo 255000, China
2 Shandong-Binzhou Cotton Technology Backyard, Binzhou 256600, China
3 China Europe Science and Technology Innovation Park, Qingdao 266000, China
4 Binzhou Academy of Agricultural Sciences, Binzhou 256600, China
5 Shandong Agricultural Technology Extension Center, Jinan 250013, China
6 Nongxi Cotton Cooperative, Binzhou 256600, China
7 College of Electronic Engineering, South China Agricultural University, Guangzhou 510642, China
* Authors to whom correspondence should be addressed.
Agronomy 2024, 14(10), 2245; https://doi.org/10.3390/agronomy14102245
Submission received: 4 July 2024 / Revised: 26 September 2024 / Accepted: 27 September 2024 / Published: 29 September 2024
(This article belongs to the Special Issue AI, Sensors and Robotics for Smart Agriculture—2nd Edition)

Abstract

The accurate estimation of soil plant analytical development (SPAD) values in cotton under various intercropping patterns with soybean is crucial for monitoring cotton growth and determining a suitable intercropping pattern. In this study, we utilized an unmanned aerial vehicle (UAV) to capture visible (RGB) and multispectral (MS) data of cotton at the bud stage, early flowering stage, and full flowering stage in a cotton–soybean intercropping pattern in the Yellow River Delta region of China, and we used a SPAD502 Plus chlorophyll meter and a tape measure to collect canopy SPAD and cotton plant height (CH) data, respectively. We analyzed the differences in cotton SPAD and CH under different intercropping ratio patterns. Pearson correlation analysis was conducted between the RGB features, MS features, and cotton SPAD, and then the recursive feature elimination (RFE) method was employed to select image features. Seven feature sets were established, including MS features (five vegetation indices + five texture features), RGB features (five vegetation indices + cotton cover), and CH, as well as combinations of these three types of features with each other. Voting regression (VR) ensemble learning was proposed for estimating cotton SPAD and compared with the performances of three models: random forest regression (RFR), gradient boosting regression (GBR), and support vector regression (SVR). The optimal model was then used to estimate and visualize cotton SPAD under different intercropping patterns. The results were as follows: (1) There was little difference in the mean value of SPAD or CH under different intercropping patterns; a significant positive correlation existed between CH and SPAD throughout the entire growth period. (2) The VR model was optimal for each of the seven feature sets used as input. When the feature set was MS + RGB, the determination coefficient (R2) of the validation set of the VR model was 0.902, the root mean square error (RMSE) was 1.599, and the relative prediction deviation (RPD) was 3.24. (3) When the feature set was CH + MS + RGB, the accuracy of the VR model was further improved: compared with the feature set MS + RGB, the R2 and RPD increased by 1.55% and 8.95%, respectively, and the RMSE decreased by 7.38%. (4) Among the cotton–soybean intercropping patterns, cotton grew best under the 4:6 planting pattern. The results can provide a reference for the selection of intercropping patterns and the estimation of cotton SPAD.

1. Introduction

Cotton, a crucial cash crop and the fundamental raw material for the cotton textile industry, faces increasingly acute competition for land with grain crops as densely populated areas expand. Adopting cotton–grain intercropping practices can effectively mitigate the tension between cotton cultivation and food crop production; this approach boosts overall crop productivity and safeguards ecological diversity [1,2,3]. It is therefore important to monitor the growth of cotton under an intercropping pattern quickly and accurately. Relative chlorophyll content, as one of the important indicators of crop growth, can reflect crop growth status in a timely manner. Xu et al. used a chlorophyll meter to measure the leaf SPAD of diseased and healthy plants for comparative analysis to assess the health of cotton leaves [4]. Yin et al. measured the SPAD of cotton leaves using a chlorophyll meter to assess the overall growth of the cotton field [5]. These studies provide effective tools for assessing the health and overall growth of individual cotton plants, offer an important reference for the effective monitoring of cotton SPAD, and help to detect plant health problems and take appropriate management measures in time.
Although traditional cotton SPAD measurements are characterized by high accuracy and ease of operation at the individual plant level, manual plant-by-plant measurements are required over large field areas, resulting in limited coverage and high labor costs. In addition, because manual sampling is limited, it is difficult to capture the spatial heterogeneity of farmland, which lowers the overall accuracy of health assessments over large areas. In contrast, UAVs are easy to operate and can effectively solve these problems; many scholars have used drones carrying various types of sensors to obtain crop data and extract features for models that estimate crop growth parameters [6,7,8,9]. Yan et al. constructed a cotton SPAD estimation model from multispectral vegetation indices combined with a radial basis function neural network for SPAD prediction, and the model R2 reached 0.848 [10]. He et al. used UAVs to obtain visible-light images and constructed a SPAD estimation model for maize canopies based on visible-light vegetation indices; the optimal random forest regression model reached an R2 of 0.868 [11]. Yin et al. constructed a SPAD estimation model for winter wheat based on multispectral vegetation indices and texture features, and the optimal LSTM model had an R2 of 0.857 [12]. Some scholars have found that multi-source sensor data fusion can reflect crop growth information from different aspects and improve the accuracy of estimation models [13]. Some scholars fused data from two sensors to establish crop inversion models [14,15]. Niu et al. combined visible-light and multispectral vegetation indices for wheat SPAD estimation, and the R2 of the optimal stepwise regression model reached 0.89 [16]. Other studies fused data from three sensors to construct crop inversion models [17,18]. Zhai et al. fused UAV MS, TIR, and RGB three-sensor data features and, based on a stacking ensemble learning model, constructed a maize SPAD estimation model with an R2 of 0.754, which outperformed models constructed from single-sensor features [19]. Some scholars fused ground-measured data with UAV image data to establish crop inversion models. Zhai et al. combined UAV MS and TIR image features with measured SPAD data as inputs to a corn aboveground biomass estimation model and found that the optimal CatBoost model could effectively improve estimation accuracy with these inputs, showing the highest R2 of 0.754 at the corn trumpet stage [20]. Zou et al. combined UAV multispectral image features with plant height data, and the results showed that incorporating plant height could significantly improve the estimation accuracy of the leaf area index at the jointing stage of winter wheat, with the optimal XGBoost model reaching an R2 of 0.88 [21].
At present, the inversion of cotton growth parameters mostly addresses cotton under a single cropping mode, with different single models used to train and validate the features. In this paper, we take cotton under different intercropping ratio patterns as the research object, acquire multispectral and visible-light images of cotton during three growth stages by UAV, extract spectral and image features, and combine them with ground-measured cotton plant height to propose a cotton SPAD estimation model based on voting regression ensemble learning. We evaluated the cotton SPAD estimation accuracy of different models and analyzed the effects of different cotton–soybean intercropping ratios on the growth of cotton, to provide a basis for the selection of the cotton–soybean intercropping ratio and the high-precision estimation of cotton SPAD.

2. Materials and Methods

2.1. Overview of the Study Area

The test area is located in the National Agricultural Science and Technology Park (37°34′53″ N, 118°3′49.3″ E), Bincheng District, Binzhou City, Shandong Province, China, at an altitude of about 18.62 m. It belongs to the warm temperate monsoon climate zone, with an average annual temperature of 13–15 °C, precipitation of 600–800 mm, and an average annual sunshine duration of about 2500–2700 h. The terrain is flat, irrigation and drainage are convenient, and the soil is mainly yellow soil, rich in organic matter, and suitable for cotton growth. The cotton–soybean intercropping patterns were 2:2, 2:3, 2:4, 4:4, and 4:6; the 2:3 pattern, for example, represents two rows of cotton and three rows of soybean. Each intercropping pattern was repeated three times, and the total sown area of cotton and soybean was 3300 m2. Cotton was planted with mulch and sown on 5 May 2023, and soybeans were interplanted in the cotton rows without film on 5 June. Field management, such as fertilization and irrigation of cotton and soybeans, included a first irrigation at the end of April to improve soil moisture and raking in early May; before raking, 150 mL of 48% floral per hectare was applied. After sowing the cotton, about 20 kg of ternary compound fertilizer (15% nitrogen, phosphorus, and potassium) was applied, then 150 mL of 50% butyl was sprayed on the surface and the soil was covered with mulch. After the cotton grew two true leaves, the seedlings were thinned and the soil was cultivated according to the predetermined density. Fertilization management for the soybeans sown in early June was consistent with that of cotton. In mid-July, 15 kg of mixed fertilizer containing 24% nitrogen, 8% phosphorus, and 10% potassium was applied; 10 kg of urea was applied at the end of July; and 150 g of potassium dihydrogen phosphate was applied in early-to-mid August. Pest and disease control sprays were applied in a timely manner according to field conditions. The planting conditions in the study area are shown in Figure 1 (the color-covered area in the lower-right corner of the figure shows cotton plants in the different intercropping patterns, representing 2:2, 2:3, 2:4, 4:4, and 4:6 from left to right).

2.2. Data Acquisition

2.2.1. Cotton Ground Data Acquisition

Ground data mainly included cotton SPAD and cotton plant height (CH). SPAD was measured with a SPAD502 Plus chlorophyll meter (Minolta, Tokyo, Japan) using a five-point sampling method in each cotton plot. To ensure the accuracy of the measurement results, measurements were taken on healthy mature leaves in the middle and upper part of the cotton plant. Five leaves were selected from each cotton plant, and each leaf was measured three times at the same position; the average value was taken as the SPAD of that plant. In total, 50 samples were obtained at each growth stage of cotton, giving 150 samples overall. At each cotton SPAD sampling position, cotton plant height was measured with a tape measure; each plant was measured twice, and the average was taken as the plant height of the sample point to ensure data accuracy. SPAD and plant height sampling was carried out at three growth stages in 2023: 9 July (bud stage), 2 August (early flowering stage), and 28 August (full flowering stage). Sampling results are shown in Table 1.

2.2.2. UAV Image Acquisition and Preprocessing

UAV remote sensing image acquisition was synchronized with the ground-based cotton SPAD acquisition. A DJI M300 RTK UAV (DJI, Shenzhen, China) carrying a P1 camera (resolution 8192 × 5460 pixels) was used to collect the visible-light images of cotton, and a DJI M210 carrying an MS600 Pro multispectral camera (Yusense, Inc., Qingdao, China) was used to acquire the multispectral images. The MS600 Pro has six single-band channels with center wavelengths and bandwidths of 450 nm@35 nm, 555 nm@25 nm, 660 nm, 710 nm@10 nm, 840 nm@30 nm, and 940 nm, and the resolution of each channel is 1280 × 960 pixels. The multispectral images were calibrated with a calibration plate for radiometric correction before each UAV takeoff. Acquisitions were made at midday under clear, well-lit conditions; the flight altitude was set to 30 m, the speed to 2 m/s, and both the heading overlap rate and the side overlap rate to 80%. Pix4Dmapper 4.5.6 software was used to stitch the UAV multispectral and visible images, and ENVI 5.6 software was used for geometric correction, cropping, and region-of-interest selection.

2.3. Feature Extraction

2.3.1. Cotton Canopy Spectral Feature Extraction

The vegetation index (VI) reflects the absorption and reflection of different wavelengths of light by vegetation; vegetation characteristics can be retrieved through linear combinations of the sensitive bands. Based on existing research results, 20 vegetation indices related to cotton SPAD were selected in this paper, and their calculation formulas are shown in Table 2.
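To make the band arithmetic concrete, the sketch below computes a few of the Table 2 indices from per-plot mean reflectance values. It is a minimal illustration rather than the authors' processing code; the function name, the small epsilon guard against division by zero, and the example reflectance values are assumptions.

```python
import numpy as np

def vegetation_indices(nir, r, g, b, re):
    """Compute a few SPAD-related vegetation indices from Table 2.

    All inputs are reflectance arrays of equal shape (e.g., per-plot means
    extracted from the stitched multispectral/RGB orthomosaics).
    """
    eps = 1e-9  # guard against division by zero over dark/bare-soil pixels
    ndvi = (nir - r) / (nir + r + eps)                      # normalized difference vegetation index
    ndre = (nir - re) / (nir + re + eps)                    # normalized difference red-edge index
    mcari = ((re - r) - 0.2 * (re - g)) * (re / (r + eps))  # modified chlorophyll absorption ratio index
    npci = (r - b) / (r + b + eps)                          # chlorophyll normalized vegetation index
    vdvi = (2 * g - r - b) / (2 * g + r + b + eps)          # visible-band difference vegetation index
    return {"NDVI": ndvi, "NDRE": ndre, "MCARI": mcari, "NPCI": npci, "VDVI": vdvi}

# Example: reflectance of a single plot (illustrative values only)
vis = vegetation_indices(nir=np.array([0.45]), r=np.array([0.08]),
                         g=np.array([0.12]), b=np.array([0.05]), re=np.array([0.25]))
print({k: v.item() for k, v in vis.items()})
```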

2.3.2. Extraction of Texture Features of Cotton Canopy

Extracting texture features (TFs) of the cotton canopy can provide information about the surface structure of the leaves, which can indirectly reflect their chlorophyll content. In this paper, texture features were calculated using the gray-level co-occurrence matrix (GLCM), which reflects the comprehensive information of the image regarding orientation, distance, and gray-level variation; eight statistics were used: mean (mea), variance (var), homogeneity (hom), contrast (con), dissimilarity (diss), entropy (ent), second moment (sec), and correlation (cor). The texture features of the five channels of the multispectral image were extracted using ENVI 5.6 software with the window set to 3 × 3; a smaller window can better detect local details in the image, while a larger window may blur these details, and the angle was left at its default value.
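The paper extracts these statistics with ENVI; the following sketch shows how the same eight GLCM statistics could be derived in Python for a single image window. The quantization to 32 gray levels, the use of scikit-image's graycomatrix, and the single 0° offset are illustrative assumptions, not the ENVI settings used in the study.

```python
import numpy as np
from skimage.feature import graycomatrix

def glcm_statistics(window, levels=32):
    """Eight GLCM statistics for one window (e.g., a 3x3 neighbourhood),
    analogous to the mea/var/hom/con/diss/ent/sec/cor features in the paper."""
    # Quantize reflectance to integer gray levels expected by graycomatrix.
    q = np.digitize(window, np.linspace(window.min(), window.max() + 1e-9, levels)) - 1
    glcm = graycomatrix(q.astype(np.uint8), distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)[:, :, 0, 0]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mean = np.sum(i * glcm)
    var = np.sum((i - mean) ** 2 * glcm)
    hom = np.sum(glcm / (1.0 + (i - j) ** 2))
    con = np.sum((i - j) ** 2 * glcm)
    diss = np.sum(np.abs(i - j) * glcm)
    ent = -np.sum(glcm * np.log(glcm + 1e-12))
    sec = np.sum(glcm ** 2)  # angular second moment
    cor = np.sum((i - mean) * (j - mean) * glcm) / (var + 1e-12)
    return dict(mea=mean, var=var, hom=hom, con=con, diss=diss, ent=ent, sec=sec, cor=cor)

# Example: texture statistics of a random 3x3 window (illustrative only)
print(glcm_statistics(np.random.default_rng(0).random((3, 3))))
```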

2.3.3. Extraction of Cotton Cover

Cotton cover (CC) describes the proportion of cotton area per unit area and can be used to characterize cotton growth. In this paper, CC was extracted from the visible-light images as a canopy structural feature for estimating cotton SPAD. The excess green index (EXG [30]) is a metric used for vegetation identification and analysis; it is calculated from the red, green, and blue channel values and highlights vegetation by emphasizing the green channel relative to the red and blue channels. The steps are as follows: the EXG image of cotton for each period is compared with a predetermined threshold value using a threshold segmentation method; pixels with EXG values above the threshold are classified as vegetation and those below the threshold as soil, generating a binary image and thereby realizing a simple but effective distinction between vegetation and soil, as shown in Figure 2. Cotton cover is calculated as shown in Formula (1).
$$CC = \frac{\text{Number of plant pixels in the region}}{\text{Total number of pixels in the region}} \quad (1)$$
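A minimal sketch of the EXG thresholding and cover calculation described above, assuming an RGB orthomosaic clip scaled to [0, 1]; the threshold value is illustrative, since the paper only states that a predetermined threshold was used.

```python
import numpy as np

def cotton_cover(rgb, exg_threshold=0.05):
    """Estimate cotton cover (CC) from an RGB orthomosaic clip of one plot.

    rgb: float array of shape (H, W, 3) scaled to [0, 1].
    exg_threshold: illustrative vegetation/soil threshold (assumed value).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                # excess green index (Woebbecke et al.)
    vegetation = exg > exg_threshold   # binary vegetation/soil mask
    return vegetation.mean()           # plant pixels / total pixels

# Example usage with a random placeholder image
cc = cotton_cover(np.random.default_rng(0).random((100, 100, 3)))
print(f"CC = {cc:.3f}")
```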

2.4. Cotton SPAD Estimation Model Construction

The SPAD data were randomly split in a ratio of 7:3, giving 105 training samples and 45 validation samples. The methods compared were random forest regression (RFR), gradient boosting regression (GBR), support vector regression (SVR), and voting regression (VR) ensemble learning.
RFR uses multiple decision trees for ensemble learning; by randomly selecting features and samples, it can effectively handle high-dimensional data and large sample sizes and is robust to missing data and noise. GBR trains weak learners iteratively and adjusts the sample weights in each iteration to progressively improve model performance, and it performs well with nonlinear relationships and data noise. SVR has efficient generalization ability and robustness to noise; by finding the best hyperplane to fit the data, it can handle complex nonlinear relationships in high-dimensional space and is suitable for small samples and nonlinear data.
An ensemble learning strategy combines the prediction results of multiple models to improve overall performance. A single model is prone to overfitting or underfitting, and ensemble learning can reduce this risk by combining the strengths of multiple models to improve generalization ability [39]. Voting regression is an intuitive and effective ensemble method: each model produces a prediction from the input data, and the predictions are then aggregated in some way to yield a final prediction. In this paper, the VR model obtains the final regression output by training independent random forest, gradient boosting tree, and support vector machine models and aggregating their predictions with soft voting (Figure 3). This ensemble approach makes full use of the advantages of each individual model and improves overall robustness and prediction performance.
For soft voting, assume that there are $N$ models participating in the voting and that each model $i$ outputs the probability or confidence $p_{ij}$ $(j = 1, 2, \dots, K)$ that sample $x$ belongs to each category $j$; the final prediction $C$ can then be derived from the following formula.
$$C = \arg\max_{j} \sum_{i=1}^{N} \omega_i \times p_{ij} \quad (2)$$
where $\omega_i$ is the weight of each model, which can be set uniformly (i.e., $\omega_i = 1/N$) or adjusted according to the performance of the model.
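As an illustration of how such a voting ensemble can be assembled, the sketch below uses scikit-learn's VotingRegressor over the three base models with a 7:3 split; the synthetic placeholder data, the hyperparameters, and the standardization applied before SVR are assumptions rather than the settings used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, VotingRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# X: samples x features matrix of the selected MS + RGB + CH features,
# y: measured SPAD values (both hypothetical placeholders here).
rng = np.random.default_rng(0)
X = rng.random((150, 17))
y = 40 + 15 * X[:, 0] + rng.normal(0, 1, 150)

# 7:3 split as described in Section 2.4 (105 training / 45 validation samples).
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=42)

voter = VotingRegressor(
    estimators=[
        ("rfr", RandomForestRegressor(n_estimators=200, random_state=42)),
        ("gbr", GradientBoostingRegressor(random_state=42)),
        ("svr", make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))),
    ],
    # weights=[w1, w2, w3]  # equal weights by default, i.e., w_i = 1/N
)
voter.fit(X_train, y_train)
y_pred = voter.predict(X_val)  # averaged prediction of the three base models
```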

2.5. Evaluation of Model Accuracy

The evaluation indexes of the estimation model include the coefficient of determination (R2), root mean square error (RMSE), and relative prediction deviation (RPD). The closer the R2 value is to 1, the stronger the model's ability to fit the data; its calculation is shown in Formula (3). The smaller the RMSE value, the smaller the prediction error of the model; its calculation is shown in Formula (4). The RPD is also a key indicator: when the RPD is less than 1.4, the model is considered to lack predictive ability; between 1.4 and 2.0, the model has estimation ability; and when the RPD is greater than 2.0, the model has excellent predictive ability. It is computed as shown in Formula (5).
$$R^2 = \frac{\left[\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\right]^2}{\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2 \sum_{i=1}^{n}\left(y_i-\bar{y}\right)^2} \quad (3)$$
$$RMSE = \sqrt{\frac{\sum_{i=1}^{n}\left(x_i-y_i\right)^2}{n}} \quad (4)$$
$$RPD = \frac{SD}{RMSE} \times \sqrt{\frac{n}{n-1}} \quad (5)$$
where $x_i$ and $\bar{x}$ are the sample measurements and their average value, $y_i$ and $\bar{y}$ are the sample predicted values and their average value, $n$ is the number of samples in the validation set, and $SD$ is the standard deviation of the predicted values.
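A small helper implementing the three evaluation indexes is sketched below. Because the formulas were partly garbled in the source text, the exact forms used here (R2 as the squared Pearson correlation, RMSE between measured and predicted values, and RPD based on the standard deviation of the predictions) follow the symbol definitions above and should be read as an assumption.

```python
import numpy as np

def evaluate(measured, predicted):
    """Return R2, RMSE, and RPD as defined in Formulas (3)-(5)."""
    x, y = np.asarray(measured, float), np.asarray(predicted, float)
    n = x.size
    # R2: squared Pearson correlation between measured and predicted values
    r2 = (np.sum((x - x.mean()) * (y - y.mean())) ** 2 /
          (np.sum((x - x.mean()) ** 2) * np.sum((y - y.mean()) ** 2)))
    # RMSE between measured and predicted values
    rmse = np.sqrt(np.sum((x - y) ** 2) / n)
    # RPD from the standard deviation of the predictions
    rpd = (y.std(ddof=0) / rmse) * np.sqrt(n / (n - 1))
    return r2, rmse, rpd

# Example usage on placeholder values
print(evaluate([50.1, 52.3, 48.7, 55.0], [49.8, 52.9, 49.5, 54.2]))
```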

3. Results and Analysis

3.1. Statistical Analysis of Cotton SPAD and Cotton Plant Height

The cotton SPAD and plant height statistics for the different cotton–soybean intercropping patterns are shown in Figure 4. Over the whole growth period, the mean SPAD values were 51.39, 51.25, 50.72, 51.06, and 50.52 for the 2:2, 2:3, 2:4, 4:4, and 4:6 patterns, respectively, and the corresponding mean plant heights were 109.84, 113.49, 115.37, 113.01, and 115.4 cm. The average SPAD of cotton in the different intercropping patterns thus fluctuated between 50.52 and 51.39, a relatively stable level, while the average plant height varied from 109.84 cm to 115.4 cm, showing only a slight variation.
Figure 5 shows the scatter plot between the measured cotton SPAD and CH. As can be seen from Figure 5, there is an obvious positive correlation between the cotton plant height and SPAD, and the plant height increases gradually with the increase in SPAD, indicating that the photosynthetic capacity and leaf health status of cotton plants have an important influence on their growth height.

3.2. Correlation Analysis and Feature Selection between Remote Sensing Features and Cotton SPAD

Pearson correlation analysis was performed between the selected vegetation indexes and the measured SPAD, and the correlation coefficients (r) are presented in Table 3. For the visible-light vegetation indexes, except for VARI (0.25) and GBDI (−0.34), the absolute values of the correlation coefficients between the vegetation indexes and cotton SPAD were all greater than 0.60, with those of WI, RBRI, RBDI, NGBDI, and NPCI all exceeding 0.78; RBRI and NPCI showed the strongest correlations with SPAD, both at −0.87. For the multispectral vegetation indexes, except for NGI (0.64), the absolute values of the correlation coefficients were all greater than 0.71. Notably, the absolute values for NDVI, TVI, SAVI, NDRE, and MCARI were all above 0.80, with MCARI showing the strongest correlation at −0.91. Among the 20 vegetation indexes, 14 had absolute correlation coefficients above 0.7, suggesting that different vegetation indexes have distinct mechanisms and sensitivities in reflecting the chlorophyll content of vegetation leaves, and that combining multiple bands can more accurately reflect the physiological state of vegetation.
The correlation coefficients between the texture features and cotton SPAD are shown in Table 4. Of the 32 texture features calculated from the five multispectral bands, most showed a high correlation with cotton SPAD, and 19 had absolute correlation coefficients exceeding 0.8. The absolute values of the correlation coefficients between the eight texture features in the NIR band and cotton SPAD were all greater than 0.69; NIR-ent showed the strongest correlation, with a coefficient as high as 0.91, followed by NIR-sec at −0.90. In the remaining channels, the weakest texture features had absolute correlation coefficients below 0.6, with G-mea and G-cor in the green channel showing the poorest correlations, both at 0.17.
Recursive feature elimination (RFE) is a feature selection method that selects the best features by repeatedly building a model and eliminating the least important features [40]. RFE was used to screen the visible-light vegetation indexes, multispectral vegetation indexes, and texture features, and the best five features of each type were retained. After screening, the RGB vegetation indexes were VDVI, RGBVI, RBRI, NGBDI, and NPCI; the MS vegetation indexes were NDVI, GNDVI, NDRE, GCI, and NGI; and the texture features were B-sec, G-ent, G-sec, RE-ent, and RE-sec. Figure 6 shows the correlation coefficients between the input features, including CC and CH, and cotton SPAD.
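A sketch of RFE-based screening with scikit-learn is shown below; the choice of a random forest as the wrapped estimator, the feature names, and the placeholder data are assumptions, since the paper does not state which estimator RFE was built on.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.ensemble import RandomForestRegressor

# feature_matrix: samples x candidate features (e.g., the 10 RGB VIs),
# spad: measured SPAD values; both are placeholders here.
rng = np.random.default_rng(1)
feature_names = ["WI", "VDVI", "VARI", "RGBVI", "RBRI", "RBDI", "NGBDI", "GBDI", "EXB", "NPCI"]
feature_matrix = rng.random((150, len(feature_names)))
spad = rng.random(150) * 20 + 40

# Keep the five most important features, as done separately for the RGB VIs,
# MS VIs, and texture features in the paper.
selector = RFE(estimator=RandomForestRegressor(n_estimators=200, random_state=42),
               n_features_to_select=5, step=1)
selector.fit(feature_matrix, spad)
selected = [name for name, keep in zip(feature_names, selector.support_) if keep]
print(selected)
```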

3.3. Analysis of Cotton SPAD Estimation Results

To evaluate the influence of different types of features on the cotton SPAD estimation model, the RFR, GBR, SVR, and VR algorithms were used to estimate cotton SPAD with each of the seven feature sets in Table 5; the validation-set estimation results based on the different feature sets are shown in Table 6 and Figure 7.
As can be seen from Table 6 and Figure 7, when only single-source features are used for estimation, the RFR, GBR, and VR models achieve their highest accuracy with RGB features as inputs, with R2 values of 0.857, 0.869, and 0.871, RMSE values of 1.938, 1.895, and 1.861, and RPD values of 2.79, 2.67, and 2.79, respectively; the SVR model achieves its highest accuracy with the CH feature as input, with an R2 of 0.793, RMSE of 2.336, and RPD of 2.20. When feature fusion is used, the RFR, GBR, SVR, and VR models all achieve their highest accuracy with MS + RGB + CH features as inputs, with R2 values of 0.906, 0.888, 0.865, and 0.916, RMSE values of 1.553, 1.711, 1.886, and 1.481, and RPD values of 3.29, 3.01, 2.72, and 3.53, respectively.
Figure 8 shows the scatterplots of the cotton SPAD estimation models with RGB features and with MS + RGB + CH features as inputs, respectively. As can be seen from the figure, the fitted distribution of the scatterplot based on the fusion of the three feature types as inputs is closer to the 1:1 line, indicating that the model with this feature set as input has better predictive ability. Therefore, multi-source data fusion can effectively overcome the limitations of a single data source and improve the stability and accuracy of the model. When the measured SPAD was in the range of 47–53, all four models overestimated low values. This may be because the sample means differed among growth stages, making it difficult for the models to accurately capture the overall data distribution and thus producing prediction bias; even though some samples fell within this range, the chlorophyll content varied among growth stages, and the model fit to the data of different growth stages was inconsistent.
To evaluate the cotton SPAD estimation ability of the different models, the statistical results of the cotton SPAD estimation accuracy of RFR, GBR, SVR, and VR with the seven feature sets as inputs are shown in Figure 9. As can be seen from the figure, the RFR model has an estimation accuracy similar to that of the GBR model, and both are higher than that of the SVR model. Compared with the RFR, GBR, and SVR models, the VR model had the highest mean R2 and RPD and the lowest mean RMSE, at 0.88, 3.03, and 1.72, respectively. This shows that the ensemble VR model can accurately predict the target variable and provide more reliable estimates with smaller prediction errors across different samples.

3.4. Spatial Distribution of Cotton SPAD Based on the Optimal Estimation Model

The optimal VR model was applied to each growth stage to obtain the spatial distribution of cotton SPAD at the three growth stages, as shown in Figure 10. It can be seen from the figure that cotton SPAD increased sharply from the bud stage to the early flowering stage and increased slowly from the early flowering stage to the full flowering stage, consistent with the changes in the measured cotton SPAD (Figure 4).
The SPAD of cotton ranged from 38.9 to 49.45 at the bud stage, from 45.37 to 56.92 at the early flowering stage, and from 47.78 to 58.24 at the full flowering stage, and the estimated and measured SPAD values in each plot were similar. Under unified field management, different cotton–soybean intercropping ratios resulted in different cotton SPAD values. At the bud stage, the three planting patterns with the highest SPAD were, in order, 4:6 > 2:3 > 4:4; at the early flowering stage, 2:3 > 4:6 > 4:4; and at the full flowering stage, 4:4 > 4:6 > 2:3. In the comprehensive analysis, the top three planting patterns with higher cotton SPAD were ranked 4:6 > 2:3 > 4:4, indicating that cotton grew best under the 4:6 intercropping pattern.

4. Discussion

In this paper, features were extracted from the visible and multispectral data of cotton in intercropping patterns acquired by UAV, and the extracted features were correlated with cotton SPAD. The vegetation index MCARI has the strongest correlation with cotton SPAD, with a correlation coefficient of −0.91. MCARI is specifically designed to enhance the characteristics of chlorophyll absorption and can effectively reflect changes in chlorophyll content by eliminating interference from the vegetation structure and soil background, which is consistent with the findings of Daughtry et al. [41]. In contrast, the vegetation index VARI has the lowest correlation with cotton SPAD (correlation coefficient of 0.25), because VARI has a relatively complex structure and parameters [42], which may limit its performance in cotton SPAD monitoring. Moreover, the high sensitivity of the VARI index to the soil background may further weaken its correlation with SPAD in the intercropping mode, because soil exposure and coverage vary greatly in intercropping systems, so VARI reflects more soil background signal than chlorophyll information.
Among the four cotton SPAD estimation models (RFR, GBR, SVR, and VR), the VR model had the highest estimation accuracy; it used soft voting to combine the advantages of the random forest, gradient boosting tree, and support vector machine models, effectively utilizing the prediction ability of multiple models. The correlation coefficient between cotton plant height and SPAD was 0.85, and combining plant height with the other data features further improved the estimation accuracy; the R2, RMSE, and RPD of the optimal VR model were 0.916, 1.481, and 3.53, respectively. This is attributed to the fact that plant height, as an important indicator of plant growth, provides additional growth information and increases the diversity of the data [21]. Therefore, cotton SPAD estimation should not be limited to features extracted from a single remote sensing source but should also consider other features that influence SPAD.
Changes in cotton SPAD values are closely related to the physiological development of cotton plants at different growth stages. At the bud stage, the leaves have not yet fully expanded and the chlorophyll synthesis rate is relatively low; as the cotton plant grows, the leaf surface area increases and chlorophyll synthesis in the leaves accelerates, resulting in a significant increase in SPAD. When cotton enters the full flowering stage, most of the leaves have grown to a certain extent and the leaf area no longer increases, but photosynthetic intensity continues to increase, the temperature and humidity are suitable, and the plant's enhanced absorption and utilization of nutrients further promote chlorophyll synthesis, so SPAD shows a slow upward trend.
Cotton–grain intercropping is an important cropping pattern for increasing ecological diversity and enhancing crop productivity. In this paper, we found that the mean SPAD values of cotton under different intercropping patterns over the whole growth period were relatively stable, fluctuating between 50.52 and 51.39. Analyzing cotton SPAD under the different cotton–soybean intercropping patterns, the three planting patterns with the best cotton growth over the three growth stages were the 4:6, 2:3, and 4:4 patterns, indicating that different cotton–soybean intercropping patterns affect cotton SPAD differently. Zeng et al. also showed that the 4:6 pattern performed best in cotton growth monitoring [43], possibly because this ratio provides more favorable conditions for light and resource competition, and soybean, as a leguminous crop, can improve the soil nitrogen content through nitrogen fixation and reduce nutrient competition between cotton and soybean.

5. Conclusions

Based on UAV visible-light and multispectral data, this paper fused visible-light features, multispectral features, and cotton plant height, used the voting regression ensemble algorithm (VR) to estimate cotton SPAD, and analyzed the influence of different intercropping patterns on cotton growth. In the correlation analysis, the MS vegetation index with the highest correlation with cotton SPAD was MCARI (−0.91), the strongest RGB vegetation indices were RBRI (−0.87) and NPCI (−0.87), and the strongest texture feature was NIR-ent (0.91); the correlation coefficient between cotton plant height and SPAD was 0.85. Compared with the RFR, GBR, and SVR models, the VR model showed the best performance in estimating cotton SPAD. Based on the VR estimation model, the model with CH + MS + RGB data fusion as input had the highest accuracy, with an R2, RMSE, and RPD of 0.916, 1.481, and 3.53, respectively. The best cotton growth was observed under the 4:6 cotton–soybean intercropping pattern. The combination of multi-source data fusion and voting regression ensemble learning in this paper provides a new and effective method for cotton SPAD estimation.

Author Contributions

Conceptualization, X.W. and J.Z. (Jing Zhao); methodology, X.W., J.Z. (Jing Zhao) and L.Y.; software, X.W., J.L., L.Y. and W.C.; validation, X.W., J.L., G.H. and Q.Z.; formal analysis, J.Z. (Jing Zhao) and Y.L.; investigation, X.W., J.L., J.Z. (Junqiang Zhang), L.Y., W.C., X.H., D.Q., G.H., Q.Z. and Z.W.; resources, J.Z. (Junqiang Zhang), X.H., D.Q. and Z.W.; data curation, X.W.; writing—original draft preparation, X.W.; writing—review and editing, X.W. and J.Z. (Jing Zhao); visualization, X.W., J.L., J.Z. (Jing Zhao) and Y.L.; supervision, J.Z. (Jing Zhao) and Y.L.; project administration, J.Z. (Jing Zhao) and L.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Natural Science Foundation Project of Shandong Province (ZR2021MD091), Qingdao Industrial Experts Program, National Key R&D Program of China (2023YFD2000200), and Development of Intelligent Seedling Release Machine for Cotton Plant under Membrane with Electrothermal Melt Film.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.

Acknowledgments

We acknowledge the support provided by the members of the Shandong University of Technology Smart Agriculture team.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Lv, Q.Q.; Chi, B.J.; He, N.; Zhang, D.M.; Dai, J.L.; Zhang, Y.J.; Dong, H.Z. Cotton-Based Rotation, Intercropping, and Alternate Intercropping Increase Yields by Improving Root-Shoot Relations. Agronomy 2023, 13, 413. [Google Scholar] [CrossRef]
  2. Li, C.J.; Hoffland, E.; Kuyper, T.W.; Yu, Y.; Zhang, C.C.; Li, H.G.; Zhang, F.S.; van der Werf, W. Syndromes of production in intercropping impact yield gains. Nat. Plants 2020, 6, 653–660. [Google Scholar] [CrossRef] [PubMed]
  3. Zou, X.X.; Shi, P.X.; Zhang, C.J.; Si, T.; Wang, Y.F.; Zhang, X.J.; Yu, X.N.; Wang, H.X.; Wang, M.L. Rotational strip intercropping of maize and peanuts has multiple benefits for agricultural production in the northern agropastoral ecotone region of China. Eur. J. Agron. 2021, 129, 126304. [Google Scholar] [CrossRef]
  4. Xu, A.L.; Wu, X.; Ma, D.G.; Wu, X.F.; Shen, H.G.; Xie, G.G.; Zhang, L.S.; Li, Y.E. Influence of Cotton Wilt Syndrome on Cotton Growth, Development, Yield and Quality. Chin. Agric. Sci. Bull. 2013, 29, 119–124. [Google Scholar]
  5. Yin, X.; Hou, Z.; Ye, J.; Min, W.; Liu, K.; Wang, F.B.; Liao, H.; Gan, H.T.; Liu, S.H.; Sun, J.L. Application of polyphenol-chlorophyll meter to monitor cotton N nutrition status. J. Plant Nutr. Fert. 2021, 27, 1198–1212. [Google Scholar]
  6. Liu, Y.B.; Pei, J.; Fang, H.J.; Liu, P.Y.; Liu, S.Y.; Zou, Y.P. Optimizing spatial window selection for rice SPAD value retrieval using multispectral UAV images. Trans. Chin. Soc. Agric. Eng. 2023, 39, 165–174. [Google Scholar]
  7. Naveed Tahir, M.; Zaigham Abbas Naqvi, S.; Lan, Y.B.; Zhang, Y.L.; Wang, Y.K.; Afzal, M.; Jehanzeb Masud Cheema, M.; Amir, S. Real time estimation of chlorophyll content based on vegetation indices derived from multispectral UAV in the kinnow orchard. Int. J. Precis. Agric. Aviat. 2018, 1, 24–31. [Google Scholar] [CrossRef]
  8. Pan, F.J.; Li, W.H.; Lan, Y.B.; Liu, X.G.; Miao, J.C.; Xiao, X.; Xu, H.Y.; Lu, L.Q.; Zhao, J. SPAD inversion of summer maize combined with multi-source remote sensing data. Int. J. Precis. Agric. Aviat. 2018, 1, 45–52. [Google Scholar] [CrossRef]
  9. Xu, H.Y.; Lan, Y.B.; Zhang, S.L.; Tian, B.Q.; Yu, H.L.; Wang, X.L.; Zhao, S.L.; Wang, Z.S.; Yang, D.J.; Zhao, J. Research on vegetation cover extraction method of summer maize based on UAV visible light image. Int. J. Precis. Agric. Aviat. 2018, 1, 44–51. [Google Scholar] [CrossRef]
  10. Yan, C.C.; Qu, Y.Y.; Chen, X.G.; Wu, H.Q.; Zhang, B.; Peng, H.B.; Chen, Q. Estimation of cotton SPAD value and leaf water content based on UAV multispectral images. Trans. Chin. Soc. Agric. Eng. 2023, 39, 61–67. [Google Scholar]
  11. He, Y.; Deng, L.; Mao, Z.W.; Sun, J. Remote Sensing Estimation of Canopy SPAD Value for Maize Based on Digital Camera. Sci. Agric. Sin. 2018, 51, 66–77. [Google Scholar]
  12. Yin, Q.; Zhang, Y.T.; Li, W.L.; Wang, J.J.; Wang, W.L.; Ahmad, I.; Zhou, G.S.; Huo, Z.Y. Better Inversion of Wheat Canopy SPAD Values before Heading Stage Using Spectral and Texture Indices Based on UAV Multispectral Imagery. Remote Sens. 2023, 15, 4935. [Google Scholar] [CrossRef]
  13. Meng, C.C.; Zhao, J.; Lan, Y.B.; Yan, C.Y.; Yang, D.J.; Wen, Y.T. SPAD Inversion Model of Corn Canopy Based on UAV Visible Light Image. Trans. Chin. Soc. Agric. Mach. 2020, 51, 366–374. [Google Scholar]
  14. Chen, R.Q.; Zhang, C.J.; Xu, B.; Zhu, Y.H.; Zhao, F.; Han, S.Y.; Yang, G.J.; Yang, H. Predicting individual apple tree yield using UAV multi-source remote sensing data and ensemble learning. Comput. Electron. Agric. 2022, 201, 107275. [Google Scholar] [CrossRef]
  15. Yu, D.Y.; Zha, Y.Y.; Sun, Z.G.; Li, J.; Jin, X.L.; Zhu, W.X.; Bian, J.; Ma, L.; Zeng, Y.J.; Su, Z.B. Deep convolutional neural networks for estimating maize above-ground biomass using multi-source UAV images: A comparison with traditional machine learning algorithms. Precis. Agric. 2023, 24, 92–113. [Google Scholar] [CrossRef]
  16. Niu, Q.L.; Feng, H.K.; Zhou, X.G.; Zhu, J.Q.; Yong, B.B.; Li, H.Z. Combining UAV Visible Light and Multispectral Vegetation Indices for Estimating SPAD Value of Winter Wheat. Trans. Chin. Soc. Agric. Mach. 2021, 52, 183–194. [Google Scholar]
  17. Ding, F.; Li, C.C.; Zhai, W.G.; Fei, S.P.; Cheng, Q.; Chen, Z. Estimation of Nitrogen Content in Winter Wheat Based on Multi-Source Data Fusion and Machine Learning. Agriculture 2022, 12, 1752. [Google Scholar] [CrossRef]
  18. Fei, S.P.; Hassan, M.A.; Xiao, Y.G.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.Y.; Chen, R.Q.; Ma, Y.T. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212. [Google Scholar] [CrossRef]
  19. Zhai, W.G.; Li, C.C.; Cheng, Q.; Ding, F.; Chen, Z. Exploring Multisource Feature Fusion and Stacking Ensemble Learning for Accurate Estimation of Maize Chlorophyll Content Using Unmanned Aerial Vehicle Remote Sensing. Remote Sens. 2023, 15, 3454. [Google Scholar] [CrossRef]
  20. Zhai, W.G.; Li, C.C.; Fei, S.P.; Liu, Y.H.; Ding, F.; Cheng, Q.; Chen, Z. CatBoost algorithm for estimating maize above-ground biomass using unmanned aerial vehicle-based multi-source sensor data and SPAD values. Comput. Electron. Agric. 2023, 214, 108306. [Google Scholar] [CrossRef]
  21. Zou, M.X.; Liu, Y.; Fu, M.D.; Li, C.J.; Zhou, Z.X.; Meng, H.R.; Xing, E.G.; Ren, Y.M. Combining spectral and texture feature of UAV image with plant height to improve LAI estimation of winter wheat at jointing stage. Front. Plant Sci. 2024, 14, 1272049. [Google Scholar] [CrossRef] [PubMed]
  22. Oteng-Frimpong, R.; Karikari, B.; Sie, E.K.; Kassim, Y.B.; Puozaa, D.K.; Rasheed, M.A.; Fonceka, D.; Okello, D.K.; Balota, M.; Burow, M.; et al. Multi-locus genome-wide association studies reveal genomic regions and putative candidate genes associated with leaf spot diseases in African groundnut (Arachis hypogaea L.) germplasm. Front. Plant Sci. 2023, 13, 1076744. [Google Scholar] [CrossRef] [PubMed]
  23. Yang, Q.; Shi, L.S.; Han, J.Y.; Zha, Y.Y.; Zhu, P.H. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res. 2019, 235, 142–153. [Google Scholar] [CrossRef]
  24. Wang, M.X.; Jin, J.Z.; Zhou, E.M.; Luo, L.J. Supercritical CO2 Extraction of Polyphenol form Jackfruit Seed. Nat. Prod. Res. Dev. 2018, 30, 1444–1448. [Google Scholar] [CrossRef]
  25. Killeen, P.; Kiringa, I.; Yeap, T.; Branco, P. Corn Grain Yield Prediction Using UAV-Based High Spatiotemporal Resolution Imagery, Machine Learning, and Spatial Cross-Validation. Remote Sens. 2024, 16, 683. [Google Scholar] [CrossRef]
  26. Li, Z.H.; Li, Z.H.; Fairbairn, D.; Li, N.; Xu, B.; Feng, H.K.; Yang, G.J. Multi-LUTs method for canopy nitrogen density estimation in winter wheat by field and UAV hyperspectral. Comput. Electron. Agric. 2019, 162, 174–182. [Google Scholar] [CrossRef]
  27. Su, W.; Wang, W.; Liu, Z.; Zhang, M.Z.; Bian, D.H.; Cui, Y.H.; Huang, J.X. Determining the retrieving parameters of corn canopy LAI and chlorophyll content computed using UAV image. Trans. Chin. Soc. Agric. Eng. 2020, 36, 58–65. [Google Scholar]
  28. Shen, Y.Y.; Yan, Z.Y.; Yang, Y.J.; Tang, W.; Sun, J.Q.; Zhang, Y.C. Application of UAV-Borne Visible-Infared Pushbroom Imaging Hyperspectral for Rice Yield Estimation Using Feature Selection Regression Methods. Sustainability 2024, 16, 632. [Google Scholar] [CrossRef]
  29. Ma, J.W.; Chen, P.F.; Sun, Y.; Gu, J.; Wang, L.J. Comparing different machine learning methods for maize leaf area index (LAI) prediction using multispectral image from unmanned aerial vehicle (UAV). Acta Agron. Sin. 2023, 49, 3364–3376. [Google Scholar]
  30. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  31. Zhang, D.Y.; Han, X.X.; Lin, F.F.; Du, S.Z.; Zhang, G.; Hong, Q. Estimation of winter wheat leaf area index using multi-source UAV image feature fusion. Trans. Chin. Soc. Agric. Eng. 2022, 38, 171–179. [Google Scholar]
  32. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  33. Li, S.Y.; Yuan, F.; Ata-Ui-Karim, S.T.; Zheng, H.B.; Cheng, T.; Liu, X.J.; Tian, Y.C.; Zhu, Y.; Cao, W.X.; Cao, Q. Combining Color Indices and Textures of UAV-Based Digital Imagery for Rice LAI Estimation. Remote Sens. 2019, 11, 1763. [Google Scholar] [CrossRef]
  34. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
  35. Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54. [Google Scholar] [CrossRef]
  36. Dai, J.G.; Xue, J.L.; Zhao, Q.Z.; Wang, Q.; Chen, B.; Zhang, G.S.; Jiang, N. Extraction of cotton seedling growth information using UAV visible light remote sensing images. Trans. Chin. Soc. Agric. Eng. 2020, 36, 63–71. [Google Scholar]
  37. Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric. 2011, 75, 75–83. [Google Scholar] [CrossRef]
  38. Clay, D.E.; Kim, K.I.; Chang, J.; Clay, S.A.; Dalsted, K. Characterizing water and nitrogen stress in corn using remote sensing. Agron. J. 2006, 98, 579–587. [Google Scholar] [CrossRef]
  39. Ju, W.; Lu, C.H.; Zhang, Y.J.; Chen, X.J.; Jiang, W.W. Research on Quantitative Regression Method of IR Spectra of Organic Compounds Based on Ensemble Learning with Wavelength Selection. Spectrosc. Spectr. Anal. 2023, 43, 239–247. [Google Scholar]
  40. Xiang, S.Y.; Xu, Z.H.; Zhang, Y.W.; Zhang, Q.; Zhou, X.; Yu, H.; Li, B.; Li, Y.F. Construction and Application of ReliefF-RFE Feature Selection Algorithm for Hyperspectral Image Classification. Spectrosc. Spectr. Anal. 2022, 42, 3283–3290. [Google Scholar]
 41. Daughtry, C.S.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey III, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  42. Ao, D.; Yang, J.H.; Ding, W.T.; An, S.S.; He, H.L. Review of 54 Vegetation Indices. J. Anhui Agric. Sci. 2023, 51, 13–21+28. [Google Scholar]
  43. Zeng, H.M.; Liu, A.T. Summer sowing cotton and soybean 4:6 intercropping high-yield cultivation technology. Prim. Agric. Technol. Ext. 2021, 9, 69–71. [Google Scholar]
Figure 1. Study area diagram.
Figure 2. Visible image soil background removal. (a) Visible raw image; (b) image after removal of soil background.
Figure 3. Schematic diagram of voting regression integration.
Figure 4. Cotton growth in different intercropping ratio patterns. (a) Cotton SPAD in different intercropping ratio patterns; (b) cotton plant height in different intercropping ratio patterns.
Figure 5. Scatter plot between SPAD and CH.
Figure 6. Correlation coefficients between input features and SPAD of cotton.
Figure 7. Accuracy of cotton SPAD estimation with different feature types and different models. (a) R2; (b) RMSE; (c) RPD.
Figure 8. Scatterplot of the estimation model with RGB features and with MS + RGB + CH features as input. (a) Scatterplot of RFR model based on RGB; (b) scatterplot of GBR model based on RGB; (c) scatterplot of SVR model based on RGB; (d) scatterplot of VR model based on RGB; (e) scatterplot of RFR model based on MS + RGB + CH; (f) scatterplot of GBR model based on MS + RGB + CH; (g) scatterplot of SVR model based on MS + RGB + CH; (h) MS + RGB + CH based VR model scatterplot.
Figure 9. Cotton SPAD estimation accuracy statistics for RFR, GBR, SVR, and VR. (a) R2; (b) RMSE; (c) RPD.
Figure 10. Spatial distribution of SPAD in cotton at different fertility stages. (a) Bud stage; (b) early flowering stage; (c) full flowering stage.
Table 1. Cotton SPAD and plant height sampling results for the whole fertility period.

| Data | Stage | Sample Size | Maximum Value | Minimum Value | Mean Value | Standard Deviation | Coefficient of Variation |
|---|---|---|---|---|---|---|---|
| SPAD value | Bud stage | 50 | 48.13 | 39.9 | 44.41 | 1.8 | 4.1% |
| | Early flowering stage | 50 | 57.8 | 45.72 | 53.82 | 2.2 | 4.1% |
| | Full flowering stage | 50 | 57.92 | 49.7 | 54.82 | 1.6 | 2.9% |
| | Whole-growth stage | 150 | 57.92 | 39.9 | 51.02 | 5.1 | 10% |
| CH value | Bud stage | 50 | 98.3 | 76.7 | 88.46 | 6.24 | 7% |
| | Early flowering stage | 50 | 138.4 | 102.8 | 121.79 | 7.57 | 6.2% |
| | Full flowering stage | 50 | 144.3 | 106.1 | 130.24 | 7.24 | 5.4% |
| | Whole-growth stage | 150 | 144.3 | 76.7 | 113.51 | 19.35 | 17% |
Table 2. The formula for calculating vegetation indices related to SPAD.

| Sensor Type | VIs | Name | Formulation |
|---|---|---|---|
| MS | NDVI [22] | Normalized difference vegetation index | (NIR − R)/(NIR + R) |
| | GNDVI [23] | Green-normalized difference vegetation index | (NIR − G)/(NIR + G) |
| | TVI [24] | Triangular vegetation index | 0.5[120(NIR − G) − 200(R − G)] |
| | SAVI [25] | Soil-adjusted vegetation index | 1.5(NIR − R)/(NIR + R + 0.5) |
| | NDRE [12] | Normalized difference red-edge index | (NIR − RE)/(NIR + RE) |
| | GDVI [16] | Green-difference vegetation index | NIR − G |
| | MCARI [26] | Modified chlorophyll absorption ratio index | [(RE − R) − 0.2(RE − G)](RE/R) |
| | GCI [27] | Green coverage index | NIR/G − 1 |
| | EVI [28] | Enhanced vegetation index | 2.5(NIR − R)/(NIR + 6R − 7.5B + 1) |
| | NGI [29] | Normalized green index | G/(NIR + G + RE) |
| RGB | WI [30] | Woebbecke index | (G − B)/(R − G) |
| | VDVI [31] | Visible-band difference vegetation index | (2G − R − B)/(2G + R + B) |
| | VARI [32] | Visible atmospherically resistant index | (G − R)/(G + R − B) |
| | RGBVI [33] | Red–green–blue vegetation index | (G² − B × R)/(G² + B × R) |
| | RBRI [13] | Blue–red ratio index | R/B |
| | NGBDI [34] | Normalized green–blue difference index | (G − B)/(G + B) |
| | RBDI [35] | Red–blue difference index | R − B |
| | GBDI [36] | Green–blue difference index | G − B |
| | ExB [37] | Excess blue index | 1.4B − G |
| | NPCI [38] | Chlorophyll normalized vegetation index | (R − B)/(R + B) |

Note: NIR, R, G, B, and RE are the reflectance of the near-infrared band, red light band, green light band, blue band, and red edge band, respectively.
Table 3. Correlation coefficients between vegetation index and cotton SPAD.

| RGB-VIs | r | MS-VIs | r |
|---|---|---|---|
| WI | 0.78 * | NDVI | −0.87 * |
| VDVI | −0.60 * | GNDVI | −0.74 * |
| VARI | 0.25 | TVI | −0.89 * |
| RGBVI | −0.64 * | SAVI | −0.86 * |
| RBRI | −0.87 * | NDRE | −0.80 * |
| RBDI | −0.78 * | GDVI | −0.75 * |
| NGBDI | −0.80 * | MCARI | −0.91 * |
| GBDI | −0.34 * | GCI | −0.73 * |
| EXB | 0.66 * | EVI | 0.71 * |
| NPCI | −0.87 * | NGI | 0.64 * |

Note: * indicates a significant correlation at the 0.01 level; the same applies to the tables below.
Table 4. Correlation coefficients between multispectral texture features and cotton SPAD.

| B-TF | r | G-TF | r | R-TF | r | RE-TF | r | NIR-TF | r |
|---|---|---|---|---|---|---|---|---|---|
| B-mea | 0.74 * | G-mea | 0.17 | R-mea | 0.86 * | RE-mea | 0.40 * | NIR-mea | 0.87 * |
| B-var | 0.68 * | G-var | 0.74 * | R-var | 0.75 * | RE-var | 0.77 * | NIR-var | 0.75 * |
| B-hom | −0.82 * | G-hom | −0.84 * | R-hom | −0.86 * | RE-hom | −0.87 * | NIR-hom | −0.89 * |
| B-con | 0.68 * | G-con | 0.75 * | R-con | 0.75 * | RE-con | 0.78 * | NIR-con | 0.76 * |
| B-dis | 0.75 * | G-dis | 0.80 * | R-dis | 0.81 * | RE-dis | 0.83 * | NIR-dis | 0.84 * |
| B-ent | 0.79 * | G-ent | 0.80 * | R-ent | 0.83 * | RE-ent | 0.86 * | NIR-ent | 0.91 * |
| B-sec | −0.79 * | G-sec | −0.80 * | R-sec | −0.81 * | RE-sec | −0.85 * | NIR-sec | −0.90 * |
| B-cor | 0.43 * | G-cor | 0.17 | R-cor | 0.42 * | RE-cor | 0.25 * | NIR-cor | 0.69 * |

Note: B-mea denotes the mean texture feature extracted from the blue channel of the multispectral image, and the rest follow the same convention. * indicates a significant correlation at the 0.01 level.
Table 5. Model input characteristics.

| Feature Type | Feature Set |
|---|---|
| MS | NDVI, GNDVI, NDRE, GCI, NGI, B-sec, G-ent, G-sec, RE-ent, RE-sec |
| RGB | VDVI, RGBVI, RBRI, NGBDI, NPCI, CC |
| CH | CH |
| MS + CH | NDVI, GNDVI, NDRE, GCI, NGI, B-sec, G-ent, G-sec, RE-ent, RE-sec, CH |
| RGB + CH | VDVI, RGBVI, RBRI, NGBDI, NPCI, CC, CH |
| MS + RGB | NDVI, GNDVI, NDRE, GCI, NGI, B-sec, G-ent, G-sec, RE-ent, RE-sec, VDVI, RGBVI, RBRI, NGBDI, NPCI, CC |
| MS + RGB + CH | NDVI, GNDVI, NDRE, GCI, NGI, B-sec, G-ent, G-sec, RE-ent, RE-sec, VDVI, RGBVI, RBRI, NGBDI, NPCI, CC, CH |
Table 6. Cotton SPAD estimation with different feature types and different algorithms.

| Model | Input Features | R2 | RMSE | RPD | Model | Input Features | R2 | RMSE | RPD |
|---|---|---|---|---|---|---|---|---|---|
| RFR | MS | 0.811 | 2.228 | 2.32 | SVR | MS | 0.777 | 2.421 | 2.14 |
| | RGB | 0.857 | 1.938 | 2.67 | | RGB | 0.738 | 2.62 | 1.96 |
| | CH | 0.771 | 2.416 | 1.94 | | CH | 0.793 | 2.336 | 2.20 |
| | MS + CH | 0.848 | 1.999 | 2.59 | | MS + CH | 0.802 | 2.278 | 2.27 |
| | RGB + CH | 0.901 | 1.574 | 3.24 | | RGB + CH | 0.809 | 2.242 | 2.31 |
| | MS + RGB | 0.872 | 1.833 | 2.83 | | MS + RGB | 0.825 | 2.147 | 2.41 |
| | MS + RGB + CH | 0.906 | 1.553 | 3.29 | | MS + RGB + CH | 0.865 | 1.886 | 2.72 |
| GBR | MS | 0.740 | 2.618 | 1.98 | VR | MS | 0.833 | 2.060 | 2.43 |
| | RGB | 0.869 | 1.895 | 2.71 | | RGB | 0.871 | 1.861 | 2.79 |
| | CH | 0.791 | 2.34 | 2.19 | | CH | 0.861 | 1.911 | 2.71 |
| | MS + CH | 0.830 | 2.115 | 2.45 | | MS + CH | 0.903 | 1.612 | 3.21 |
| | RGB + CH | 0.882 | 1.756 | 2.95 | | RGB + CH | 0.912 | 1.519 | 3.34 |
| | MS + RGB | 0.879 | 1.781 | 2.91 | | MS + RGB | 0.902 | 1.599 | 3.24 |
| | MS + RGB + CH | 0.888 | 1.711 | 3.01 | | MS + RGB + CH | 0.916 | 1.481 | 3.53 |

