Article

Research on Rapeseed Above-Ground Biomass Estimation Based on Spectral and LiDAR Data

1 Jiangsu Key Laboratory of Crop Genetics and Physiology/Jiangsu Key Laboratory of Crop Cultivation and Physiology, Agricultural College of Yangzhou University, Yangzhou 225009, China
2 Department of Clinical Medicine, Jiangsu Health Vocational College, Nanjing 211800, China
3 Precision Agriculture Lab, School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Agronomy 2024, 14(8), 1610; https://doi.org/10.3390/agronomy14081610
Submission received: 7 June 2024 / Revised: 3 July 2024 / Accepted: 22 July 2024 / Published: 23 July 2024
(This article belongs to the Special Issue Remote Sensing Applications in Crop Monitoring and Modelling)

Abstract

Estimating rapeseed above-ground biomass (AGB) is of significant importance: AGB reflects the growth status of the crop, underpins its commercial value, supports the development of modern agriculture, and helps predict yield. Previous studies have mostly estimated crop AGB from spectral indices extracted from spectral images. This study constructs regression models for estimating rapeseed AGB by combining spectral and LiDAR data, building separate models across rapeseed varieties, nitrogen application rates, and planting densities to find the optimal estimation method. The results show that the R² for all samples in the study reached above 0.56, with the highest overall R² being 0.69. The highest R² for the QY01 and ZY03 varieties was 0.56 and 0.78, respectively. Under high- and low-nitrogen conditions, the highest R² was 0.64 and 0.67, respectively. At a planting density of 540,000 plants per hectare, the highest R² was 0.81. This study improves the accuracy of rapeseed AGB estimation.

1. Introduction

Rapeseed (Brassica napus subsp. napus) is a significant source of edible oil and protein-rich livestock feed [1]. Over the past five years, the global average rapeseed yield has been approximately 2.1 tons per hectare (hm2). The biomass and quality of rapeseed vary due to location, crop variety, and their interactions [2]. Above-ground biomass (AGB) is closely related to crop nutritional status, making it a valuable indicator of crop growth conditions [3]. In the research conducted by M. Corti, it was found that the trend of nitrogen content in the AGB of cover crops followed that of AGB. Indeed, treatments characterized by high nitrogen content had a high AGB (e.g., white mustard) and a high nitrogen concentration (e.g., hairy vetch) [4]. In a separate study by Qiu, the impact of nutrients such as nitrogen, phosphorus, and potassium on biomass was investigated. The findings indicated that crops with better nutritional status tend to produce more AGB [5]. Understanding the spatiotemporal dynamics of AGB is crucial for formulating and implementing site-specific crop management strategies. Timely AGB monitoring is an essential component of precision agriculture.
AGB is typically measured through manual sampling, a process that can be time-consuming and labor-intensive [6]. This method relies heavily on subjective, often inaccurate, and labor-intensive ground survey methods [7]. Remote-sensing methods from orbital and sub-orbital platforms have gained prominence as powerful tools in estimating AGB by observing the crops’ and environment’s physical, chemical, or biological properties, such as temperature, humidity, vegetation type, or land use [3]. Unmanned aerial vehicles (UAVs) are increasingly prominent due to their high flexibility, ease of operation, high spatial resolution, and on-demand data acquisition capabilities. Consequently, UAVs provide a novel technological approach for rapidly and non-destructively extracting field crop phenotypic information [8].
UAVs equipped with multispectral cameras have been proven to be a mature method for predicting crop biomass. Their advantages include multispectral imaging, high spatial resolution, and rapid data acquisition [9]. Fu et al. utilized a multi-rotor UAV equipped with a multispectral camera to collect canopy spectral data during various wheat growth stages. They integrated multiple parametric and non-parametric modeling methods to monitor key growth indicators, including leaf area index (LAI) and leaf dry matter (LDM), and predict grain yield [10]. Zheng et al. improved rice AGB estimation by combining texture information from UAV multispectral (MS) images with spectral data. Specifically, they found that the normalized difference texture index (NDTI) based on mean textures derived from the red and green bands performed better than other texture variables and spectral indices. This research contributes valuable methods for enhancing crop growth monitoring using UAV imagery [11]. Niu et al. utilized UAV red, green, and blue (RGB) imagery to estimate plant height and vegetation indices. They compared the performance of models based solely on vegetation indices, solely on plant height, and a combination of both. The results indicated that the plant height directly extracted from UAV RGB point clouds strongly correlated with ground-truth measurements. Additionally, the vegetation indices derived from UAV RGB imagery showed significant potential for estimating corn biomass in field conditions [12]. Bendig et al. estimated fresh and dry biomass on a summer barley test site with 18 cultivars and two nitrogen (N) treatments using plant height (PH) derived from the crop surface model (CSM). The super-high-resolution, multi-temporal CSM was derived from RGB images captured from a small UAV. The combination of spectral indices and plant height outperformed using a single index alone [13]. Yue et al. constructed several single-parameter models for estimating AGB based on spectral parameters. These models included specific wavelength bands, spectral indices (such as the normalized difference vegetation index (NDVI), greenness index (GI), and wide dynamic range vegetation index (WDRVI)), and crop height. Comparative analysis of experimental results demonstrated that incorporating crop height into the models improved the accuracy of AGB estimation [14].
Light detection and ranging (LiDAR) for estimating vegetation height and AGB in forestry applications has matured. Researchers have successfully demonstrated the utility and potential of repeat LiDAR data for resource monitoring and carbon management. Zhao et al. described robust techniques that are highly suitable for analyzing multi-temporal LiDAR data. They also affirmed the utility and potential of repeat LiDAR data for resource monitoring and carbon management. In crop applications, LiDAR is still in its early stages [15]. However, Jimenez-Berni et al. demonstrated the capabilities of LiDAR installed on lightweight mobile ground platforms. They conducted rapid, non-destructive canopy height, ground cover, and AGB estimations. Their findings revealed a strong relationship between canopy height and LiDAR (with an R2 of 0.99 and an RMSE of 0.0017 m), emphasizing the close correlation between LiDAR-based AGB predictions and actual AGB [16].
Currently, biomass monitoring is primarily accomplished through spectral and spectral index data. Kross et al. demonstrated the applicability of biomass estimation for two types of crops (corn and soybeans) with different canopy structures, leaf structures, and photosynthetic pathways using Rapid Eye multispectral data [17]. Wang et al. utilized drone multispectral technology to acquire large-scale plant spectral information and, based on this, compared machine-learning methods to find the optimal model for estimating the biomass of camphor trees [18]. Before conducting our experiment, we found a close relationship between biomass and plant height through preliminary experiments and observations. However, the spectral data need to adequately reflect plant height, leading to instability in biomass estimation. Therefore, we combined radar-extracted plant height with spectral data to estimate biomass, improving accuracy. The properties of the sensors limit the responses generated by different sensors, and the information obtained is limited. Using multiple sensors to acquire and integrate data is necessary, which can avoid bottlenecks caused by a lack of sufficient features [19]. Wan et al. simultaneously obtained RGB and multispectral images of rice at multiple stages, extracted plant height and vegetation index, respectively, and integrated these two types of feature information to predict rice biomass. The prediction accuracy was significantly higher than prediction models based on a single sensor [9]. Zhu et al. extracted vegetation index, texture index, elevation, and temperature information from hyperspectral, RGB, lidar, and thermal infrared images, achieving precise estimation of corn’s LAI, AGB, plant height, chlorophyll concentration, and water content [20].
Previous studies have utilized remote-sensing techniques for AGB estimation, achieving favorable results. However, uncertainties persist in estimating rapeseed AGB under complex cultivation conditions, especially when dealing with multiple varieties and nitrogen treatments, often influenced by population architecture and leaf color. LiDAR data can effectively capture canopy structure, and when combined with spectral data, they can better differentiate population variations caused by different varieties and cultivation methods. Therefore, this study aims to enhance AGB estimation accuracy and applicability by integrating UAV LiDAR and spectral image data. The research objectives include (1) selecting vegetation indices and texture indices closely related to AGB based on different spectral combinations; (2) constructing and optimizing AGB estimation models by combining LiDAR data with spectral image features, comparing model accuracy, and conducting practical validation; and (3) assessing model suitability for AGB estimation across different varieties and nitrogen levels.

2. Materials and Methods

2.1. Experimental Design

As shown in Figure 1, the study area is located in Zhenjiang City, Jiangsu Province (119°18′45″ E, 32°10′59″ N). The altitude ranges from 2.0 m to 5.5 m, and the area has a humid subtropical monsoon climate. The test varieties in this field are the cabbage-type hybrid rapeseed varieties ZY03 and QY01. The previous crop was corn, harvested in late September, and the sowing date was 5 October 2021. The planting comprises ten density levels: 60,000, 120,000, 180,000, 240,000, 300,000, 360,000, 420,000, 480,000, 540,000, and 600,000 plants per hectare. Two nitrogen application rates, 225 kg/hm2 and 270 kg/hm2, were used, with the nitrogen fertilizer split between base fertilizer and jointing fertilizer at a 5:5 ratio. Each plot received phosphorus and potassium fertilizers at 120 kg/hm2 and boron fertilizer at 4.5 kg/hm2; the plot area was 2.4 m × 10 m = 48 m2. At each stage, three samples (three replicates) were taken from each plot. Seedlings were established at the four-to-five-leaf stage after mechanical broadcasting.

2.2. UAV Image Acquisition

The UAVs used in this study, both produced by DJI (SZ DJI Technology Co., Ltd., Shenzhen, China), are the Phantom 4 Multispectral and the Matrice M300 RTK equipped with a DJI L1 LiDAR sensor, as shown in Figure 2. The Phantom 4 Multispectral captures spectral information in five bands: R (650 ± 16 nm), G (560 ± 16 nm), B (450 ± 16 nm), NIR (840 ± 26 nm), and RE (730 ± 16 nm); the forward and side image overlap rates are 80% and 60%, respectively. The DJI L1 mounted on the Matrice M300 RTK has an effective point cloud rate of 240,000 points/s with three returns and a detection range of 450 m (at 80% reflectivity, 0 klx). The UAVs flew under clear weather and low wind speed between 10 a.m. and 2 p.m. local time, at a height of 20 m above the test site and a constant speed of 1.7 m/s. There were 49 waypoints, a flight line length of 2958 m, 24 main flight lines, a gimbal pitch angle of 90°, a shooting interval of 2 s, a flight-direction overlap rate of 75%, and a side overlap rate of 70%.

2.3. AGB Determination

Sampling was conducted within one day of the UAV flights and the in-field spectral radiometer measurements. The samples covered the different nitrogen fertilizer levels and planting densities, for a total of 360 fresh-biomass samples. A 50 cm × 50 cm frame was placed in each plot, and all rapeseed plants within the frame were dug out and brought back to the laboratory, where the samples were cleaned, the roots were cut off, and the stems, leaves, and ears were weighed. The samples were then dried at 70 °C for 120 h, and the dry biomass of each plant was weighed again.

2.4. Image Feature Extraction

The acquired visible images of rapeseed were calibrated with a reference whiteboard and then stitched together in DJI Terra to obtain orthoimages, digital surface models (DSMs), and a digital elevation model (DEM) of the experimental site. ArcGIS and ENVI 5.3 were used to crop the images. The DN values of the R, G, and B channels were extracted from the UAV images and normalized to obtain r, g, and b. The calculation formulas are as follows:
r = R / (R + G + B)
g = G / (R + G + B)
b = B / (R + G + B)
After obtaining r, g, and b, they are mathematically combined to obtain color indices. Color indices calculated from the three components are significantly related to the dynamic changes in crop growth. Based on a literature review, this paper summarizes 12 standard color indices (Table 1) used in AGB estimation research. The calculation formulas are as follows:
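As an illustration, the normalization above and a few widely used color indices (ExG, ExR, ExGR, and NGRDI) can be sketched in Python. The index definitions below follow the common forms in the literature and are stated here as assumptions, since Table 1 is not reproduced in the text:

```python
def normalize_rgb(R, G, B):
    """Normalized chromatic coordinates r, g, b from raw DN values."""
    total = R + G + B
    return R / total, G / total, B / total

def color_indices(R, G, B):
    """A few common color indices (assumed definitions; see Table 1 for the full set)."""
    r, g, b = normalize_rgb(R, G, B)
    exg = 2 * g - r - b            # excess green index
    exr = 1.4 * r - g              # excess red index
    exgr = exg - exr               # excess green minus excess red
    ngrdi = (g - r) / (g + r)      # normalized green-red difference index
    return {"ExG": exg, "ExR": exr, "ExGR": exgr, "NGRDI": ngrdi}
```

For a gray pixel (R = G = B), ExG and NGRDI are both zero, which is a quick sanity check on the implementation.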
Vegetation indices were extracted from the multispectral images; they serve as essential indicators for measuring crop growth and distinguishing crop types. Based on previous research, this paper summarizes 13 standard vegetation indices (Table 2) used in AGB estimation research. The calculation formulas are as follows:
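A minimal sketch of several of the multispectral indices used in this study (NDVI, GNDVI, SAVI with L = 0.5, and OSAVI), computed from band reflectances; the formulas are the standard ones from the literature and are assumed to match Table 2:

```python
def vegetation_indices(nir, red, green):
    """Standard multispectral vegetation indices from band reflectances (0-1)."""
    ndvi = (nir - red) / (nir + red)                 # normalized difference vegetation index
    gndvi = (nir - green) / (nir + green)            # green NDVI
    savi = 1.5 * (nir - red) / (nir + red + 0.5)     # soil-adjusted VI, L = 0.5
    osavi = (nir - red) / (nir + red + 0.16)         # optimized SAVI
    return {"NDVI": ndvi, "GNDVI": gndvi, "SAVI": savi, "OSAVI": osavi}
```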
Texture features are obtained by extracting texture parameters through specific image-processing techniques to give a quantitative or qualitative description of texture. They are leading indicators for target detection and image classification and are widely used in AGB estimation. The gray-level co-occurrence matrix (GLCM) was proposed by Haralick [44]. The algorithm builds a GLCM from the relative positions of image pixels to represent the spatial gray-level dependence between them and thereby capture the texture information of the image. Although the GLCM describes the direction, interval, and amplitude of gray-level variation, it cannot directly provide characteristics that distinguish textures. Therefore, statistical properties that quantitatively describe texture based on the GLCM must be calculated. The calculation formulas for five commonly used texture statistics are as follows:
Contrast = Σ_{i,j=0}^{N−1} P_{i,j} (i − j)²
Correlation = Σ_{i,j=0}^{N−1} P_{i,j} [(i − μ_i)(j − μ_j) / (σ_i σ_j)]
Energy = √( Σ_{i,j} P_{i,j}² )
Homogeneity = Σ_{i,j=0}^{N−1} P_{i,j} / [1 + (i − j)²]
ASM = Σ_{i,j} P_{i,j}²
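The GLCM construction and the statistics above can be sketched as follows, for a single pixel offset on a quantized gray-level image (a simplified illustration, not the exact ENVI implementation; correlation is omitted for brevity since it additionally needs the marginal means and standard deviations):

```python
import math

def glcm(image, dx=1, dy=0, levels=8):
    """Symmetric, normalized gray-level co-occurrence matrix for one pixel offset."""
    P = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    pairs = 0
    for i in range(rows):
        for j in range(cols):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < rows and 0 <= j2 < cols:
                a, b = image[i][j], image[i2][j2]
                P[a][b] += 1
                P[b][a] += 1          # count both directions (symmetric GLCM)
                pairs += 2
    return [[v / pairs for v in row] for row in P]

def texture_stats(P):
    """Contrast, homogeneity, ASM, and energy from a normalized GLCM."""
    n = len(P)
    cells = [(i, j, P[i][j]) for i in range(n) for j in range(n)]
    contrast = sum(p * (i - j) ** 2 for i, j, p in cells)
    homogeneity = sum(p / (1 + (i - j) ** 2) for i, j, p in cells)
    asm = sum(p ** 2 for _, _, p in cells)
    return {"contrast": contrast, "homogeneity": homogeneity,
            "ASM": asm, "energy": math.sqrt(asm)}
```

For a constant image, all co-occurrences fall on the diagonal, so contrast is 0 while homogeneity and ASM are 1.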

2.5. LiDAR Feature Extraction

A UAV-based LiDAR scanning system is used to fly over the fields and collect high-density LiDAR point cloud data and image data. DJI Pilot (SZ DJI Technology Co., Ltd., Shenzhen, China) processes the raw data to generate true-color point cloud data. The ArcGIS 10.8 software is used to import the point cloud data and create an LAS dataset. The LAS dataset undergoes preprocessing, including noise removal, ground point separation, and DEM generation. The LAS dataset is then normalized, i.e., the DEM height is subtracted from the point cloud height to obtain the normalized height. The normalized LAS dataset undergoes vegetation height extraction using a grid-based interpolation method, resulting in a vegetation height grid layer or feature class.
H = (R1 − R2) · sin(θ)
where H is the height of the plant, R1 is the distance from the LiDAR sensor to the ground, R2 is the distance from the sensor to the top of the plant, and θ is the incidence angle of the beam.
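The normalization step (point height minus DEM ground elevation) and the grid-based height extraction described above can be sketched as follows. The per-cell maximum is used here as a simple canopy-height statistic; the actual ArcGIS workflow may use a different interpolation:

```python
def normalize_points(points, dem):
    """Height above ground: subtract the DEM ground elevation from each point's z.
    points: iterable of (x, y, z); dem: callable (x, y) -> ground elevation."""
    return [(x, y, z - dem(x, y)) for x, y, z in points]

def canopy_height_grid(norm_points, cell=0.5):
    """Rasterize normalized heights: per-cell maximum as the vegetation height."""
    grid = {}
    for x, y, h in norm_points:
        key = (int(x // cell), int(y // cell))
        grid[key] = max(grid.get(key, 0.0), h)
    return grid
```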

2.6. Modeling and Validation

This study collected 360 samples during the critical growth period, of which 70% were used for modeling and 30% for validation. Three models, random forest (RF), linear regression (LR), and Extreme Gradient Boosting (XGBoost), were compared, and the Particle Swarm Optimization (PSO) algorithm was employed to fine-tune their parameters and select the most effective method [45]. RF is an ensemble of tree predictors in which each tree depends on the values of a random vector sampled independently and with the same distribution for all trees in the forest [46]. LR is a supervised learning algorithm that predicts a continuous target variable under the assumption of a linear relationship between the target and input variables [47]. XGBoost introduces a sparsity-aware algorithm and a weighted quantile sketch for approximate tree learning, together with insights on cache access patterns, data compression, and sharding that allow the tree-boosting system to scale to billions of examples using far fewer resources than earlier systems [48]. This paper uses the coefficient of determination (R²) as the evaluation criterion, the root mean square error (RMSE) to evaluate model accuracy, and the normalized root mean square error (nRMSE) to describe relative accuracy. The calculation formulas for R², RMSE, and nRMSE are as follows:
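As a minimal, self-contained illustration of the 70/30 modeling workflow, the sketch below uses ordinary least squares with a single predictor as a stand-in for the actual RF/LR/XGBoost pipeline (whose hyperparameters were tuned by PSO):

```python
import random

def split_70_30(X, y, seed=0):
    """Random 70/30 split of paired samples into modeling and validation sets."""
    idx = list(range(len(y)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * 0.7)
    train, test = idx[:cut], idx[cut:]
    return ([X[i] for i in train], [y[i] for i in train],
            [X[i] for i in test], [y[i] for i in test])

def fit_ols(x, y):
    """Least-squares fit of y = a*x + b for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx
```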
R² = 1 − Σ_{i=1}^{N} (y_i − ŷ_i)² / Σ_{i=1}^{N} (y_i − ȳ)²
RMSE = √( Σ_{i=1}^{N} (y_i − ŷ_i)² / N )
nRMSE = RMSE / ȳ × 100%
where y_i and ŷ_i are the measured and estimated values, respectively; N is the sample size; and ȳ is the mean of the measured values.
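The three evaluation metrics can be written directly from the formulas above:

```python
import math

def r2(y, yhat):
    """Coefficient of determination."""
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

def rmse(y, yhat):
    """Root mean square error."""
    return math.sqrt(sum((yi - yh) ** 2 for yi, yh in zip(y, yhat)) / len(y))

def nrmse(y, yhat):
    """RMSE as a percentage of the mean measured value."""
    return rmse(y, yhat) / (sum(y) / len(y)) * 100
```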

3. Results

3.1. Feature Correlation Analysis

To investigate the relationship between rapeseed biomass and spectral data, different spectral indices were extracted for each band combination. Under the RGB band combination, the spectral indices EXG, Coverage, INT, IKAW, VARI, ExR, GLI, ExGR, NGRDI, NGBDI, MGRVI, RGBVI, and RGRI were extracted, and the texture features contrast, correlation, energy, homogeneity, and ASM were calculated. Under the GRN band combination, the spectral indices Coverage, NDVI, NDWI, VI2, GVI, SAVI, GNDVI, OSAVI, and TVI were extracted. Under the GRNRE band combination, the spectral indices NDVI, NDWI, NDRE, VI1, VI2, CI, GVI, SAVI, GNDVI, OSAVI, TVI, NDCI, and CRI550 were extracted. Pearson correlation analysis was used to screen the spectral data and select the indices strongly correlated with AGB (Figure 3). Under the RGB band combination, four spectral indices, VARI, GLI, NGRDI, and MGRVI, were selected because their correlation with AGB was greater than 0.69, while the rest were below 0.55. Under the GRN and GRNRE band combinations, four spectral indices, NDVI, GVI, SAVI, and OSAVI, were selected because their correlation with AGB was greater than 0.71, while the rest were below 0.49. Additionally, under the GRNRE band combination, an extra index, CI, was selected because its correlation coefficient with AGB reached 0.75, giving five selected indices for GRNRE: NDVI, CI, GVI, SAVI, and OSAVI. These spectral indices allow a more accurate estimation of rapeseed AGB and provide a basis for rapeseed growth monitoring and management.
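The screening step can be sketched as a plain Pearson correlation with a threshold on |r| (the 0.69 threshold below mirrors the RGB case in the text; the feature names are illustrative):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def screen_features(features, agb, threshold=0.69):
    """Keep only the indices whose |r| with AGB exceeds the threshold.
    features: dict of index name -> list of per-plot values."""
    return {name: pearson(vals, agb) for name, vals in features.items()
            if abs(pearson(vals, agb)) > threshold}
```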
There is no collinearity between the spectral data and LiDAR data of rapeseed, meaning there is no significant linear correlation between them. This indicates that the spectral and LiDAR data reflect different aspects of the characteristics of rapeseed and can complementarily provide information on its AGB. Therefore, this study will attempt to use the combination of spectral data and LiDAR data to construct a more accurate estimation model for the AGB of rapeseed.

3.2. Image Feature Modeling Results

This study is based on rapeseed data covering different treatments and varieties; the mean values (Table 3) also aid in assessing accuracy. After a correlation analysis between the extracted image features and AGB, these features were used to build random forest regression models to estimate rapeseed AGB (Figure 4). Without distinguishing varieties or nitrogen treatments, the random forest model achieved an R² of 0.42 with an RMSE of 1503.66 kg/hm2 under the RGB band combination, an R² of 0.43 with an RMSE of 1438.87 kg/hm2 under the GRN band combination, and an R² of 0.58 with an RMSE of 1217.40 kg/hm2 under the GRNRE band combination. Overall, models built from image features alone achieved only moderate accuracy.

3.2.1. Results of the Combination of Image and LiDAR

This study selected the image features strongly correlated with AGB under the three band combinations RGB, GRN, and GRNRE (Figure 5), including color and vegetation indices. These image features were then combined with the elevation data from LiDAR to construct random forest regression, linear regression, and XGBoost models for estimating rapeseed AGB. Without distinguishing varieties or nitrogen treatments, the random forest model combining image features and LiDAR data achieved an R² of 0.56 with an RMSE of 1291.59 kg/hm2 under the RGB band combination, an R² of 0.56 with an RMSE of 1232.38 kg/hm2 under the GRN band combination, and an R² of 0.67 with an RMSE of 1056.86 kg/hm2 under the GRNRE band combination. After incorporating the elevation data, R² improved and RMSE decreased markedly: the RMSE for the RGB, GRN, and GRNRE band combinations decreased by 284.07 kg/hm2, 206.49 kg/hm2, and 160.54 kg/hm2, respectively. By comparison, the regression models using only image features performed relatively poorly.
In this study, a new approach was attempted. Namely, the construction of regression models relied on image features and incorporated LiDAR data into the models. This attempt yielded significant results, with a noticeable improvement in the predictive accuracy of the models. This discovery provides a new perspective; that is, when constructing regression models, we should consider and utilize more data types, such as LiDAR data. Furthermore, LiDAR data can also be introduced into the AGB estimation models of other crops, thereby playing a more significant role. In future research on other crops, introducing more data types can further improve the predictive accuracy of the models.

3.2.2. The Results between Different Varieties

In this study, two varieties of rapeseed, namely QY01 and ZY03, were used. Both varieties used the image features obtained under the RGB band combination and LiDAR data to construct a random forest model (Figure 6). The R2 of QY01 was 0.45, the RMSE was 1077.41 kg/hm2, the R2 of ZY03 was 0.78, and the RMSE was 1112.20 kg/hm2. It can be seen that under the same conditions, the AGB estimation of QY01 did not achieve a positive effect, while the R2 of the AGB estimation of ZY03 increased by 0.33, and the RMSE decreased by 44.79 kg/hm2. In summary, the estimation model based on image features and LiDAR data has significantly improved AGB estimation accuracy for ZY03.

3.2.3. The Results of Different Nitrogen Application Rates

This study has two nitrogen fertilizer treatments, 225 kg/hm2 and 270 kg/hm2 (from now on referred to as low-nitrogen and high-nitrogen treatments). Both nitrogen fertilizer treatments use the image features and LiDAR data obtained under the RGB band combination to construct a random forest model (Figure 7). The R2 under high-nitrogen treatment is 0.52, and the RMSE is 1246.47 kg/hm2; the R2 under low-nitrogen treatment is 0.67, and the RMSE is 1132.34 kg/hm2. When distinguishing between nitrogen fertilizer treatments, the R2 of the high-nitrogen treatment increased by 0.15, and the RMSE decreased by 114.13 kg/hm2. Although the low-nitrogen treatment did not improve, it did not decrease either.

3.2.4. The Results of Different Planting Densities

This study has ten density levels, specifically 60,000, 120,000, 180,000, 240,000, 300,000, 360,000, 420,000, 480,000, 540,000, and 600,000 plants per ha. These ten density levels are used to construct a random forest model (Figure 8) using the image features and LiDAR data obtained under the combination of RGB bands. As can be seen from the figure, the R2 accuracy is relatively high at the density levels of 60,000, 300,000, and 540,000 plants per hectare with R2 values of 0.69, 0.69, and 0.81, and RMSE values of 1596.06 kg/hm2, 1399.56 kg/hm2, and 861.87 kg/hm2, respectively.

3.2.5. Cross-Validation

To ensure the stability and reliability of the experiment, we used cross-validation to validate the regression model under various conditions. We uniformly used the best-performing GRNRE band combination for validation (Figure 9). Under all sample conditions, the R² was 0.73, and the RMSE was 975.29 kg/hm². When distinguishing varieties, the R² for QY01 was 0.64, and the RMSE was 1000.97 kg/hm², while for ZY03, the R² was 0.80, and the RMSE was 903.69 kg/hm². When distinguishing nitrogen application amounts, the R² for HN was 0.72, and the RMSE was 1024.54 kg/hm², while for LN, the R² was 0.70, and the RMSE was 975.46 kg/hm². The results indicate that the progress of this study is reliable.

3.3. Comparison Results of Different Models

Through the performance of the models above, the results show that overall, the XGBoost model is relatively balanced, whether in overall estimation or in distinguishing each variety and different nitrogen application levels. In the case of distinguishing varieties, each model has its advantages, but the best results are obtained under the GRNRE band combination. In the case of different nitrogen fertilizer treatments, the best results are also obtained with the GRNRE band combination type (Figure 10).

4. Discussion

This study improves AGB prediction accuracy by incorporating LiDAR data alongside spectral features. Our use of color and vegetation indices to estimate rapeseed AGB with fair accuracy is consistent with the conclusions of other researchers, although there is still room for higher accuracy. Regarding model comparison, this study examined models based on spectral indices alone and models incorporating elevation data. For models based on spectral indices, the best prediction across all samples was achieved by the GRNRE band combination with an R² of 0.68. When distinguishing rapeseed varieties, the best prediction for QY01 was achieved by the GRNRE band combination with an R² of 0.55, while ZY03 was best estimated with the RGB band combination with an R² of 0.76. When distinguishing nitrogen application amounts, HN and LN were best predicted with the RGB band combination, with R² values of 0.57 and 0.63, respectively. LiDAR estimation of AGB is feasible, as shown in Jin's study, which combined the AquaCrop (Rome, Italy) model with optical and LiDAR imaging data using a position and orientation system algorithm to estimate winter wheat AGB, yielding predictions highly correlated with measurements; this is consistent with our research [49]. Many studies have shown that plant height can be used to estimate crop yield. Ji et al. obtained the plant height of broad beans from UAV images and used machine-learning algorithms to explore the relationship between yield and plant height at different time points, and combinations of time points, thereby estimating broad bean yield [50]. Feng et al. used low-cost UAVs to obtain cotton image data and evaluated the feasibility of using image-based plant height to estimate cotton yield [51]. Adark et al. used a machine-learning-based regression algorithm to better predict corn yield and flowering time from temporal vegetation indices and plant height [52]. Tao et al. improved the accuracy of wheat yield prediction by adding crop plant height extracted from UAV hyperspectral measurements to the prediction model [53]. Most of these studies improve prediction accuracy by directly modeling yield from plant height; we instead combine plant height with spectral data to further improve AGB estimation. For models incorporating elevation data, the best prediction across all samples was achieved by the GRNRE band combination with an R² of 0.69. When distinguishing rapeseed varieties, the best prediction for QY01 was achieved by the GRNRE band combination with an R² of 0.56, while ZY03 was best estimated with the GRN band combination with an R² of 0.78. When distinguishing nitrogen application amounts, the best prediction for HN was achieved by the GRNRE band combination with an R² of 0.64, while LN was best predicted with the GRN band combination with an R² of 0.67.
Regarding flight height and plant height measurement accuracy, the error does not grow substantially with increasing height. This is consistent with the results of Wu, who found, in height-measurement research with UAV-mounted LiDAR systems, that plant height measurement accuracy is reasonable when the measurement height is below 20 m [54]. Seifert et al. evaluated the accuracy of estimating potato crop growth characteristics at different UAV flight heights: a UAV equipped with a multispectral camera flew at heights of 15 and 30 m over experimental fields planted with various potato varieties; plant height, volume, and NDVI were evaluated and compared with manually obtained parameters [55]. At 15 m and 30 m, the UAV-measured plant height was significantly linearly correlated with the manually estimated plant height, with correlation coefficients of 0.80 and 0.75, respectively [56].
In terms of model comparison, this study selected three representative models for comparison: random forest, linear regression, and XGBoost. We classified modeling for three band combinations. Under all samples, the R² of the LG, RF, and XGBoost regression models was 0.65, 0.67, and 0.69, respectively. When distinguishing varieties, the R² for variety QY01 was 0.42, 0.45, and 0.56, respectively, while for variety ZY03, the R² was 0.76, 0.78, and 0.65, respectively. When distinguishing nitrogen application amounts, the R² for HN was 0.57, 0.52, and 0.64, respectively, while for LN, the R² was 0.63, 0.67, and 0.55, respectively. It can be seen that the XGBoost model performs stably and well in all situations; only in the case of distinguishing variety ZY03 and nitrogen application amount LN does RF perform excellently. In summary, XGBoost is this study’s most suitable model for predicting rapeseed biomass. Sara et al. used three different AGB prediction models, compared the accuracy of each model in the AGB prediction task, and also compared and evaluated the AGB prediction maps to assess their ability to reconstruct the fundamental biomass dynamics [57]. Zhang et al. comprehensively evaluated eight machine-learning regression algorithms for forest AGB estimation based on multiple satellite data products and found the most suitable model [58]. Han et al. evaluated and compared four machine-learning regression algorithms (multiple linear regression algorithm, support vector machine algorithm, artificial neural network algorithm, and random forest algorithm) to create a suitable model and then tested whether two sampling methods would affect the training model [59]. This study proves that this method has strong applicability under different varieties and different nitrogen application conditions, indicating its wide range of practicality. Expansion research and prospects of this study: Our method can also measure AGB, such as wheat and rice. 
With the continued integration of sensor technologies, the cost of such applications is falling and their operation is becoming simpler.

5. Conclusions

This study designed field experiments for winter rapeseed with different varieties and nitrogen application rates. UAV images and elevation information of the rapeseed canopy were obtained with UAVs. Based on the RGB, GRN, and GRNRE band combinations, image features strongly correlated with AGB, including color and vegetation indices, were extracted, and the elevation difference was calculated from the elevation information. The AGB of rapeseed was then modeled and validated using random forest, linear regression, and XGBoost models. The results show the following: (1) Among the three regression models, XGBoost performs best overall, whether in the overall estimation or under the different variety and nitrogen fertilizer treatments, outperforming both the random forest and linear regression models. (2) Adding elevation data dramatically improves the modeling: without distinguishing varieties or nitrogen treatments, R² is 0.69; when distinguishing rapeseed varieties, the R² of QY01 and ZY03 is 0.56 and 0.78, respectively; when distinguishing nitrogen treatments, the R² of the high- and low-nitrogen treatments is 0.64 and 0.67, respectively. (3) Of the three band combinations, GRNRE is the best for estimating rapeseed AGB; the spectral indices extracted from it give excellent estimates across models, varieties, and nitrogen treatments. This study provides an effective method for the remote-sensing estimation of winter rapeseed AGB and a reference for AGB estimation in other crops.
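The elevation-difference step described above can be sketched as follows. This is a minimal illustration under assumed inputs (a hypothetical 5 × 5 crop surface raster and ground-elevation raster for one plot), not the exact processing chain of the study; summarizing per-pixel heights by an upper percentile is one common way to obtain a plot-level canopy height.

```python
import numpy as np

def elevation_difference(dsm, dtm, percentile=95):
    """Plot-level canopy height proxy: crop surface elevation (DSM) minus
    ground elevation (DTM), summarized by an upper percentile so that
    soil gaps between plants do not drag the estimate down."""
    height = np.clip(dsm - dtm, 0, None)  # negative differences are noise
    return float(np.percentile(height, percentile))

# Hypothetical rasters for one rapeseed plot (metres); the last row/column
# represents bare soil at the plot edge.
dtm = np.full((5, 5), 12.0)
dsm = dtm + np.array([
    [0.9, 1.0, 1.1, 0.8, 0.0],
    [1.0, 1.2, 1.1, 0.9, 0.1],
    [0.8, 1.1, 1.3, 1.0, 0.0],
    [0.9, 1.0, 1.2, 0.9, 0.2],
    [0.0, 0.1, 0.0, 0.1, 0.0],
])
print(round(elevation_difference(dsm, dtm), 2))  # plot-level height in metres
```

The resulting scalar can then be appended to the spectral features of the same plot before model fitting.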

Author Contributions

Y.J.: Investigation, visualization, writing—original draft. F.W. (Fang Wu): Data curation. S.Z.: Methodology. W.Z.: Software. F.W. (Fei Wu): Conceptualization. T.Y.: Supervision. G.Y.: Formal analysis. Y.Z.: Conceptualization. C.S.: Visualization. T.L.: Funding acquisition, writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

The National Natural Science Foundation of China (32172110, 32071945), the Key Research and Development Program (Modern Agriculture) of Jiangsu Province (BE2022342-2, BE2020319), Anhui Province Crop Intelligent Planting and Processing Technology Engineering Research Center Open Project (ZHKF04), the National Key Research and Development Program of China (2018YFD0300805, 2023YFD1202200), the Special Funds for Scientific and Technological Innovation of Jiangsu Province, China (BE2022425), the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Central Public-interest Scientific Institution Basal Research Fund (JBYW-AII-2023-08), the Science and Technology Innovation Project of the Chinese Academy of Agricultural Sciences (CAAS-CS-202201), and the Special Fund for Independent Innovation of Agriculture Science and Technology in Jiangsu, China (CX(22)3112).

Data Availability Statement

The original contributions presented in the study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wu, D.; Liang, Z.; Yan, T.; Xu, Y.; Xuan, L.; Tang, J.; Zhou, G.; Lohwasser, U.; Hua, S.; Wang, H.; et al. Whole-Genome Resequencing of a Worldwide Collection of Rapeseed Accessions Reveals the Genetic Basis of Ecotype Divergence. Mol. Plant 2019, 12, 30–43. [Google Scholar] [CrossRef]
  2. Rajković, D.; Marjanović Jeromela, A.; Pezo, L.; Lončar, B.; Zanetti, F.; Monti, A.; Kondić Špika, A. Yield and Quality Prediction of Winter Rapeseed—Artificial Neural Network and Random Forest Models. Agronomy 2021, 12, 58. [Google Scholar] [CrossRef]
  3. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
  4. Qiu, Y.; Li, X.; Tang, Y.; Xiong, S.; Han, Y.; Wang, Z.; Feng, L.; Wang, G.; Yang, B.; Lei, Y.; et al. Directly linking plant N, P and K nutrition to biomass production in cotton-based intercropping systems. Eur. J. Agron. 2023, 151, 126960. [Google Scholar] [CrossRef]
  5. Corti, M.; Bechini, L.; Cavalli, D.; Ben Hassine, M.; Michelon, L.; Cabassi, G.; Pricca, N.; Perego, A.; Marino Gallina, P. Early sowing dates and pre-plant nitrogen affect autumn weed control and nitrogen content of winter cover crops in rotation with spring crops. Eur. J. Agron. 2024, 155, 127140. [Google Scholar] [CrossRef]
  6. Freeman, K.W.; Girma, K.; Arnall, D.B.; Mullen, R.W.; Martin, K.L.; Teal, R.K.; Raun, W.R. By-plant prediction of corn forage biomass and nitrogen uptake at various growth stages using remote sensing and plant height. Agron. J. 2007, 99, 530–536. [Google Scholar] [CrossRef]
  7. Reynolds, C.A.; Yitayew, M.; Slack, D.C.; Hutchinson, C.F.; Huete, A.; Petersen, M.S. Estimating crop yields and production by integrating the FAO Crop specific Water Balance model with real-time satellite data and ground-based ancillary data. Int. J. Remote Sens. 2000, 21, 3487–3508. [Google Scholar] [CrossRef]
  8. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
  9. Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y.; et al. Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer–a case study of small farmlands in the South of China. Agric. For. Meteorol. 2020, 291, 108096. [Google Scholar] [CrossRef]
  10. Fu, Z.; Jiang, J.; Gao, Y.; Krienke, B.; Wang, M.; Zhong, K.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Wheat Growth Monitoring and Yield Estimation based on Multi-Rotor Unmanned Aerial Vehicle. Remote Sens. 2020, 12, 508. [Google Scholar] [CrossRef]
  11. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  12. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef]
  13. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  14. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of Winter Wheat Above-Ground Biomass Using Unmanned Aerial Vehicle-Based Snapshot Hyperspectral Sensor and Crop Height Improved Models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef]
  15. Zhao, K.; Suarez, J.C.; Garcia, M.; Hu, T.; Wang, C.; Londo, A. Utility of multitemporal lidar for forest and carbon monitoring: Tree growth, biomass dynamics, and carbon flux. Remote Sens. Environ. 2018, 204, 883–897. [Google Scholar] [CrossRef]
  16. Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR. Front. Plant Sci. 2018, 9, 237. [Google Scholar] [CrossRef] [PubMed]
  17. Kross, A.; McNairn, H.; Lapen, D.; Sunohara, M.; Champagne, C. Assessment of RapidEye vegetation indices for estimation of leaf area index and biomass in corn and soybean crops. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 235–248. [Google Scholar] [CrossRef]
  18. Wang, Q.; Lu, X.; Zhang, H.; Yang, B.; Gong, R.; Zhang, J.; Jin, Z.; Xie, R.; Xia, J.; Zhao, J. Comparison of Machine Learning Methods for Estimating Leaf Area Index and Aboveground Biomass of Cinnamomum camphora Based on UAV Multispectral Remote Sensing Data. Forests 2023, 14, 1688. [Google Scholar] [CrossRef]
  19. Cen, H.; Wan, L.; Zhu, J.; Li, Y.; Li, X.; Zhu, Y.; Weng, H.; Wu, W.; Yin, W.; Xu, C.; et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 2019, 15, 32. [Google Scholar] [CrossRef]
  20. Zhu, W.; Sun, Z.; Huang, Y.; Yang, T.; Li, J.; Zhu, K.; Zhang, J.; Yang, B.; Shao, C.; Peng, J.; et al. Optimization of multi-source UAV RS agro-monitoring schemes designed for field-scale crop phenotyping. Precis. Agric. 2021, 22, 1768–1802. [Google Scholar] [CrossRef]
  21. Núñez, J.; Otazu, X.; Fors, O.; Prades, A.; Palà, V.; Arbiol, R. Multiresolution-based image fusion with additive wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1211. [Google Scholar] [CrossRef]
  22. Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54. [Google Scholar] [CrossRef]
  23. Pocas, I.; Rodrigues, A.; Goncalves, S.; Costa, P.M.; Goncalves, I.; Pereira, L.S.; Cunha, M. Predicting Grapevine Water Status Based on Hyperspectral Reflectance Vegetation Indices. Remote Sens. 2015, 7, 16460–16479. [Google Scholar] [CrossRef]
  24. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  25. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  26. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  27. Jannoura, R.; Brinkmann, K.; Uteau, D.; Bruns, C.; Joergensen, R.G. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosyst. Eng. 2015, 129, 341–351. [Google Scholar] [CrossRef]
  28. Song, X.; Wu, F.; Lu, X.; Yang, T.; Ju, C.; Sun, C.; Liu, T. The Classification of Farming Progress in Rice-Wheat Rotation Fields Based on UAV RGB Images and the Regional Mean Model. Agriculture 2022, 12, 124. [Google Scholar] [CrossRef]
  29. Evstatiev, B.; Mladenova, T.; Valov, N.; Zhelyazkova, T.; Gerdzhikova, M.; Todorova, M.; Grozeva, N.; Sevov, A.; Stanchev, G. Fast Pasture Classification Method using Ground-based Camera and the Modified Green Red Vegetation Index (MGRVI). Int. J. Adv. Comput. Sci. Appl. 2023, 14, 45–51. [Google Scholar] [CrossRef]
  30. Guo, Z.-C.; Wang, T.; Liu, S.-L.; Kang, W.-P.; Chen, X.; Feng, K.; Zhang, X.-Q.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land–based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 102239. [Google Scholar] [CrossRef]
  31. Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape. Remote Sens. 2018, 10, 1484. [Google Scholar] [CrossRef]
  32. Cao, R.; Chen, Y.; Shen, M.; Chen, J.; Zhou, J.; Wang, C.; Yang, W. A simple method to improve the quality of NDVI time-series data by integrating spatiotemporal information with the Savitzky-Golay filter. Remote Sens. Environ. 2018, 217, 244–257. [Google Scholar] [CrossRef]
  33. Hassan, M.A.; Yang, M.; Rasheed, A.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Time-Series Multispectral Indices from Unmanned Aerial Vehicle Imagery Reveal Senescence Rate in Bread Wheat. Remote Sens. 2018, 10, 809. [Google Scholar] [CrossRef]
  34. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  35. Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 1248. [Google Scholar] [CrossRef]
  36. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  37. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  38. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
  39. Schleicher, T.D.; Bausch, W.C.; Delgado, J.A.; Ayers, P.D. Evaluation and Refinement of the Nitrogen Reflectance Index (NRI) for Site-Specific Fertilizer Management; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 1998; p. 011151. [Google Scholar]
  40. Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
  41. Chen, J.M. Evaluation of Vegetation Indices and a Modified Simple Ratio for Boreal Applications. Can. J. Remote Sens. 2014, 22, 229–242. [Google Scholar] [CrossRef]
  42. Peñuelas, J.; Filella, I.; Gamon, J.A. Assessment of photosynthetic radiation-use efficiency with spectral reflectance. New Phytol. 2006, 131, 291–296. [Google Scholar] [CrossRef]
  43. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141. [Google Scholar] [CrossRef]
  44. Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef]
  45. Lynn, N.; Suganthan, P.N. Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm Evol. Comput. 2015, 24, 11–24. [Google Scholar] [CrossRef]
  46. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  47. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
  48. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar] [CrossRef]
  49. Jin, X.; Li, Z.; Yang, G.; Yang, H.; Feng, H.; Xu, X.; Wang, J.; Li, X.; Luo, J. Winter wheat yield estimation based on multi-source medium resolution optical and radar imaging data and the AquaCrop model using the particle swarm optimization algorithm. ISPRS J. Photogramm. Remote Sens. 2017, 126, 24–37. [Google Scholar] [CrossRef]
  50. Ji, Y.; Liu, R.; Xiao, Y.; Cui, Y.; Chen, Z.; Zong, X.; Yang, T. Faba bean above-ground biomass and bean yield estimation based on consumer-grade unmanned aerial vehicle RGB images and ensemble learning. Precis. Agric. 2023, 24, 1439–1460. [Google Scholar] [CrossRef]
  51. Feng, A.; Zhang, M.; Sudduth, K.A.; Vories, E.D.; Zhou, J. Cotton yield estimation from UAV-based plant height. Trans. ASABE 2019, 62, 393–403. [Google Scholar] [CrossRef]
  52. Adak, A.; Murray, S.C.; Bozinovic, S.; Lindsey, R.; Nakasagga, S.; Chatterjee, S.; Anderson, S.L.; Wilde, S. Temporal Vegetation Indices and Plant Height from Remotely Sensed Imagery Can Predict Grain Yield and Flowering Time Breeding Value in Maize via Machine Learning Regression. Remote Sens. 2021, 13, 2141. [Google Scholar] [CrossRef]
  53. Tao, H.; Feng, H.; Xu, L.; Miao, M.; Yang, G.; Yang, X.; Fan, L. Estimation of the Yield and Plant Height of Winter Wheat Using UAV-Based Hyperspectral Images. Sensors 2020, 20, 1231. [Google Scholar] [CrossRef] [PubMed]
  54. Wu, X.; Li, W.; Hong, D.; Tao, R.; Du, Q. Deep Learning for UAV-based Object Detection and Tracking: A Survey. IEEE Geosci. Remote Sens. Mag. 2021, 10, 91–124. [Google Scholar] [CrossRef]
  55. Njane, S.; Tsuda, S.; Van Marrewijk, B.; Polder, G.; Katayama, K.; Tsuji, H. Effect of varying UAV height on the precise estimation of potato crop growth. Front. Plant Sci. 2023, 14, 1233349. [Google Scholar] [CrossRef] [PubMed]
  56. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef]
  57. Björk, S.; Anfinsen, S.N.; Naesset, E.; Gobakken, T.; Zahabu, E. On the Potential of Sequential and Nonsequential Regression Models for Sentinel-1-Based Biomass Prediction in Tanzanian Miombo Forests. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 4612–4639. [Google Scholar] [CrossRef]
  58. Zhang, Y.; Ma, J.; Liang, S.; Li, X.; Li, M. An Evaluation of Eight Machine Learning Regression Algorithms for Forest Aboveground Biomass Estimation from Multiple Satellite Data Products. Remote Sens. 2020, 12, 4015. [Google Scholar] [CrossRef]
  59. Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef]
Figure 1. A geographical coordinate map of the experimental field.
Figure 2. Technical route map.
Figure 3. Correlation analysis diagram.
Figure 4. The 1:1 line plots of the random forest model for all samples under the (A) RGB, (B) GRN, and (C) GRNRE band combinations. The light blue shaded area around the regression line represents the 95% confidence interval. The green dots represent the relationship between the measured AGB and the AGB predicted by the model.
Figure 5. The 1:1 line plots of the random forest model built with the addition of LiDAR data for all samples under the (A) RGB, (B) GRN, and (C) GRNRE band combinations. The light blue shaded area around the regression line represents the 95% confidence interval. The green dots represent the relationship between the measured AGB and the AGB predicted by the model.
Figure 6. The 1:1 line plots of the random forest models built with LiDAR data: (A) variety QY01 in the RGB band and (B) variety ZY03 in the GRN band. The light blue shaded area around the regression line represents the 95% confidence interval. The green dots represent the relationship between the measured AGB and the AGB predicted by the model.
Figure 7. The 1:1 line plots of the random forest models built with LiDAR data: (A) the high-nitrogen treatment in the RGB band and (B) the low-nitrogen treatment in the GRN band. The light blue shaded area around the regression line represents the 95% confidence interval. The green dots represent the relationship between the measured AGB and the AGB predicted by the model.
Figure 8. A comparison of the accuracy of the random forest model constructed through image features and LiDAR data under ten density levels.
Figure 9. Ten-fold cross-validation results: (A) all samples, (B) variety QY01, (C) variety ZY03, (D) the high nitrogen application rate, and (E) the low nitrogen application rate. The light blue shaded area around the regression line represents the 95% confidence interval. The green dots represent the relationship between the measured AGB and the AGB predicted by the model.
Figure 10. Comparison of accuracy between linear regression, random forest, and XGBoost models under various conditions.
Table 1. Color index.
Color Index | Calculation Formula | References
INT | (r + g + b)/3 | [21]
IKAW | (r − b)/(r + b) | [22]
VARI | (g − r)/(g + r − b) | [23]
ExR | 1.4r − g | [24]
ExG | 2g − r − b | [25]
GLI | (2g − r − b)/(2g + r + b) | [26]
ExGR | 3g − 2.4r − b | [24]
NGRDI | (g − r)/(g + r) | [27]
NGBDI | (g − b)/(g + b) | [28]
MGRVI | (g² − r²)/(g² + r²) | [29]
RGBVI | (g² − b·r)/(g² + b·r) | [30]
RGRI | r/g | [31]
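A minimal sketch of how the Table 1 color indices are computed from an RGB image: the channels are first normalized to chromatic coordinates (the lower-case r, g, b in the formulas), after which each index is simple arithmetic. The digital numbers below are hypothetical.

```python
import numpy as np

def normalized_channels(R, G, B):
    """Chromatic coordinates: each channel divided by the brightness sum,
    giving the lower-case r, g, b used in the color-index formulas."""
    total = R + G + B
    return R / total, G / total, B / total

def color_indices(R, G, B):
    """A subset of the Table 1 color indices."""
    r, g, b = normalized_channels(R, G, B)
    return {
        "ExG":   2 * g - r - b,                    # excess green
        "NGRDI": (g - r) / (g + r),                # normalized green-red
        "MGRVI": (g**2 - r**2) / (g**2 + r**2),    # modified green-red
        "RGRI":  r / g,                            # red-green ratio
    }

# Hypothetical mean digital numbers for one vegetated plot
idx = color_indices(R=np.float64(80), G=np.float64(130), B=np.float64(60))
```

The arguments can equally be full 2D image arrays, in which case every index is computed per pixel before averaging over a plot.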
Table 2. Vegetation index.
Vegetation Index | Calculation Formula | References
NDVI | (Nir − R)/(Nir + R) | [32]
NDREI | (Nir − RE)/(Nir + RE) | [33]
EVI | 2.5(Nir − R)/(1 + Nir + 6R − 7.5B) | [34]
GNDVI | (Nir − G)/(Nir + G) | [35]
OSAVI | 1.16(Nir − R)/(Nir + R + 0.16) | [36]
MCARI | [(RE − R) − 0.2(RE − G)](RE/R) | [37]
TCARI | 3[(RE − R) − 0.2(RE − G)(RE/R)] | [38]
NRI | (G − R)/(G + R) | [39]
TVI | √(NDVI + 0.5) | [40]
MSR | (Nir/R − 1)/(√(Nir/R) + 1) | [41]
SIPI | (Nir − B)/(Nir − R) | [42]
PSRI | (R − B)/Nir | [43]
CIRE | (Nir/RE) − 1 | [35]
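Likewise, a few of the Table 2 multispectral indices can be computed directly from band values. The sketch below uses hypothetical reflectances; the band names and the 0–1 reflectance range are assumptions for illustration.

```python
import numpy as np

def vegetation_indices(nir, red, green, red_edge):
    """A subset of the Table 2 vegetation indices from band reflectances."""
    ndvi = (nir - red) / (nir + red)
    return {
        "NDVI":  ndvi,
        "NDREI": (nir - red_edge) / (nir + red_edge),
        "GNDVI": (nir - green) / (nir + green),
        "OSAVI": 1.16 * (nir - red) / (nir + red + 0.16),
        "TVI":   np.sqrt(ndvi + 0.5),
        "CIRE":  nir / red_edge - 1,
    }

# Hypothetical plot-mean reflectances for a healthy rapeseed canopy
vi = vegetation_indices(nir=0.45, red=0.05, green=0.08, red_edge=0.30)
```

Passing 2D band arrays instead of scalars yields per-pixel index maps, which can then be averaged per plot before modeling.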
Table 3. Mean values.
Classification | Levels and Mean Values (kg/hm²)
Not distinguished | All samples: 8252.11
Variety | QY01: 8533.41; ZY03: 7971.60
Nitrogen application rate | HN: 8428.25; LN: 8076.77
Planting density (plants/mu) | 4000: 6514.56; 8000: 7945.34; 12,000: 7691.11; 16,000: 7665.42; 20,000: 8142.95; 24,000: 8361.43; 28,000: 9389.90; 32,000: 9212.80; 36,000: 8615.98; 40,000: 8985.49