Article

Advanced Plant Phenotyping: Unmanned Aerial Vehicle Remote Sensing and CimageA Software Technology for Precision Crop Growth Monitoring

1 Department of Intelligent Agriculture, College of Agriculture, Hunan Agricultural University, Changsha 410128, China
2 Institute of Agricultural Environment and Ecology, Hunan Academy of Agricultural Sciences, Changsha 410125, China
* Authors to whom correspondence should be addressed.
Agronomy 2024, 14(11), 2534; https://doi.org/10.3390/agronomy14112534
Submission received: 2 September 2024 / Revised: 4 October 2024 / Accepted: 23 October 2024 / Published: 28 October 2024
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

In production activities and breeding programs, large-scale investigations of high-throughput crop phenotype information are needed to improve management and decision-making. The development of UAV (unmanned aerial vehicle) remote sensing technology provides a new means for the large-scale, efficient, and accurate acquisition of crop phenotypes, but its practical application and popularization are hindered by the complicated data processing required. To date, there is no automated system that can use canopy images acquired by UAV to conduct phenotypic character analysis. To address this bottleneck, we developed a new scalable software tool called CimageA. CimageA takes crop canopy images obtained by UAV as input and combines machine vision and machine learning techniques for the high-throughput processing and phenotyping of crop remote sensing data. First, zoning tools are applied to draw areas of interest (AOIs). Then, CimageA rapidly extracts vital remote sensing information such as the color, texture, and spectrum of the crop canopy in the plots. In addition, we developed data analysis modules that estimate and quantify related phenotypes (such as leaf area index, canopy coverage, and plant height) by analyzing the association between measured crop phenotypes and CimageA-derived remote sensing eigenvalues. Through a series of experiments, we confirmed that CimageA performs well in extracting high-throughput remote sensing information on crops, and verified the reliability of retrieving LAI (R2 = 0.796) and estimating plant height (R2 = 0.989) and planting area using CimageA. In short, CimageA is an efficient and non-destructive tool for crop phenotype analysis, which is of great value for monitoring crop growth and guiding breeding decisions.

1. Introduction

Under the threat of adverse conditions such as environmental degradation, resource scarcity, and the rapid growth of the global population, the expeditious selection of crop varieties with high yield, high quality, and strong stress resistance is becoming increasingly urgent for breeders worldwide [1,2]. Phenotypic traits are the key indicators used to evaluate the responses of different genetic varieties to environmental influences, and investigating the phenotypic traits of large-scale populations is a highly effective way to promote crop breeding [3,4,5].
In the past, phenotypic measurement in breeding programs was a time-consuming and labor-intensive task, since it involved the frequent periodic monitoring of hundreds of thousands of crop genotypes [6,7]. In addition, this traditional manual sampling method suffers from large errors, high costs, and the risk of damage to plants, so it does not meet the development requirements of modern agriculture. There is an urgent need to develop reliable, efficient, and high-throughput phenotyping methods to provide new insights and support to breeders [8,9].
To overcome the difficulties of traditional phenotypic measurement methods, many high-throughput plant phenotyping platforms have been designed. These tools usually exploit specific correlations between crop phenotypic traits [10] and spectral data, or directly obtain multidimensional phenotypic information on crop organs, plants, and populations from reconstructed plant skeletal structures [11]. For example, the TraitMill platform developed by CropDesign in 1998 is an early prototype of a high-throughput phenomic research facility, and it has been applied to the evaluation and screening of rice genes and their functions, as well as to crop breeding [12]. The Bellwether phenotyping platform (St. Louis, MO, USA) at the Donald Danforth Plant Science Center can accommodate more than 1000 plant pots and monitors crop height, biomass, water status, and other information [13]. The Plant Phenomics Analysis Platform (PPAP) was established by the Institute of Genetics and Developmental Biology (IGDB) of the Chinese Academy of Sciences (CAS). PPAP consists of eight integrated units: visible imaging, infrared imaging, near-infrared (NIR) imaging, root NIR imaging, fluorescence imaging, chlorophyll fluorescence imaging, hyperspectral imaging, and LiDAR imaging. Although there have been advances in phenotyping platforms, which are of great value to researchers investigating plant phenotypic traits, such fixed platforms are expensive to build and maintain [14] and cannot be easily used due to the limited accessibility of their custom hardware, proprietary software, and specialized software packages [15,16].
In recent years, UAV remote sensing systems have gradually attracted increasing attention, and field crop phenotype analyses based on UAV remote sensing have played a crucial role in crop breeding [17]. A UAV can be equipped with various types of sensors (multispectral, hyperspectral, LiDAR, thermal imaging, etc.) to form a noncontact phenotype measurement platform. These platforms are characterized by their simple deployment, strong flexibility, and wide coverage, as well as their ability to obtain high-spatial-resolution image information. The image information obtained by UAV remote sensing platforms is an important basis for the extraction of key agronomic traits, because crop plants have specific spectral absorption and radiation characteristics, and the information extracted by UAV remote sensing can reflect differences in crops’ morphological and biochemical characteristics. To extract more biologically relevant traits, some studies have combined the morphological, spectral, and textural eigenvalues derived from remote sensing images with measured phenotypic data, and used machine learning methods (random forests, decision trees, support vector machines, etc.) or simple linear regression for the non-destructive detection of many phenotypic traits, such as chlorophyll content [18], LAI [19], moisture content [20], and biomass.
Compared to the other sensors carried by UAVs, multispectral sensors have the unique advantage of providing information about the major spectral bands associated with plant phenotypes at a lower cost. Although many studies have proven that crop phenotype analyses based on UAV multispectral remote sensing platforms are feasible, major obstacles remain for non-experts, because multispectral imaging generates large amounts of data, and analyzing and managing remote sensing imaging data is a great challenge. Some scholars have developed pipelines and algorithms to address the data processing problem [21], but these algorithms usually cover only part of the operational process. Other scholars have made great efforts to simplify and optimize the processing; for example, Wang et al. [22] developed HSI-PP, which effectively simplifies the extraction of data such as image features, vegetation indices, and structural information.
The ultimate goal of field crop phenotyping research based on UAV remote sensing is to develop tools that can analyze crop phenotypes in a high-throughput, precise, efficient, and automated manner [23]. Considering the importance of such a tool, and the fact that very few studies have developed one specifically for UAV remote sensing imagery that includes both high-throughput remote sensing information extraction and phenotype analysis modules, we carried out this study. Its objectives were as follows: (1) to develop easy-to-use software for the automatic analysis of remote sensing images that quantifies crop canopy morphology, spectral features, texture, and other remote sensing feature information; (2) to integrate various modeling or machine learning algorithms to construct crop phenotype inversion models using the software-derived remote sensing eigenvalues as inputs; and (3) to test the platform and its utility for research.

2. Materials and Methods

2.1. Field Trial and Materials Acquisition

The experimental area is located at Hunan Agricultural University, Changsha, Hunan Province, China (28°11′ N, 113°04′ E). A variety of crops (such as ramie, jute, and cabbage) and crop resources are grown in the area. This study focused on ramie. The ramie planting area consisted of 72 plots in which 36 ramie germplasm resources were planted. Each variety was replicated twice, and each plot was about 1.5 × 1 m, with 2 rows of 3 stumps, a stump spacing of 0.2 m, a row spacing of 0.3 m, and a drainage ditch width of 0.5 m.
The UAV remote sensing images and ground-measured data used for the test were collected from March to May 2023, covering five periods (13 March, 26 March, 9 April, 26 April, and 7 May). The images were acquired with a DJI Phantom 4 Pro (DJI Innovation Technology Co., Ltd., Shenzhen, China) and then underwent radiometric calibration, geometric correction, and 3D reconstruction in DJI Terra to generate high-resolution orthophotos, multi-channel spectral reflectance images (red, blue, green, near-infrared, and red-edge), and a high-precision crop digital surface model (DSM) of the test area (Figure 1).

2.2. Overview of CimageA

CimageA is a MATLAB-based software tool that allows users to extract high-throughput remote sensing feature data from crop canopy imagery captured by UAV and provides a series of modeling algorithms to support crop phenotype inversion. The software presents the visual results of the phenotypic analyses. Figure 2 shows the user interface of CimageA. The menu includes five components: (1) file processing, including image import, image saving, image information display, and image size consistency verification; (2) image processing and parameter settings, including image tilt correction, boundary demarcation, and scale correction; (3) data processing, including data clearing, data copying, and data saving; (4) data analysis, including data correlation analysis, model selection, eigenvalue selection, model training, model running, and result analysis; and (5) phenotypic visualization.
The operation process is as follows. In the first step, click “File processing—Open image”, select the folder where the images are stored, and import images in the correct format according to the requirements. The images that can be imported include the visible band image, blue band image, green band image, red band image, red-edge image, NIR image, and DSM image. After the images are imported, you can check their information via “Opening image file information—File size consistency suggestion”. The second step is to select the way to draw boundaries according to the requirements and delimit the AOI (area of interest) from which to extract the data. If you choose “Automatic demarcation”, you can set the number of rows and the interval in “Image processing—Parameter setting”. In the third step, click “Data processing—Save data” to extract and export the remote sensing eigenvalues of the AOI. In the fourth step, switch to the “Data analysis” interface, click “Data source” to import the data for inversion, define the measured phenotype data as the Y value and the remote sensing eigenvalues as X values, click “Correlation analysis” to run the correlation analysis between the phenotype data and the remote sensing eigenvalues, and click “Regression model” to select an appropriate model for training (Figure 3).

2.3. Function Realization

2.3.1. Tilt Correction

The stitched orthophoto often has an inclination angle with respect to the horizontal direction, which affects subsequent AOI drawing. After importing the image and using the “Tilt correction” function, the software can rotate the image by any angle via a user-defined calibration line, and the final displayed image will be parallel to the drawn calibration line (Figure 4).
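The tilt-correction idea (derive a rotation angle from a user-drawn calibration line, then rotate the whole image so that line becomes horizontal) can be sketched as follows. The function name, signature, and use of SciPy are illustrative assumptions, not CimageA's actual MATLAB implementation:

```python
import math
import numpy as np
from scipy.ndimage import rotate

def correct_tilt(image: np.ndarray, p1, p2):
    """Rotate `image` so the calibration line from p1=(x1, y1) to
    p2=(x2, y2) ends up horizontal.

    Hypothetical helper for illustration; CimageA's routine is not public.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # angle of the calibration line relative to the horizontal axis
    angle_deg = math.degrees(math.atan2(dy, dx))
    # rotating by the measured angle aligns the line with the image's
    # horizontal edge; bilinear interpolation (order=1) keeps it fast
    return rotate(image, angle_deg, reshape=True, order=1)
```

With `reshape=True` the output canvas grows so no image content is clipped, which matches the behavior described for the displayed image.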

2.3.2. Intelligent Drawing of AOI

CimageA offers users the option of either “Manual zoning” or “Automatic zoning” to generate plots with customized ranges and batch numbering. With “Manual zoning”, the user draws the first plot and then clicks “Copy AOI” to paste it as a template for generating the other plots; this mode is suitable for irregular field plot division and provides more flexible operation. “Automatic zoning” includes two modes, “Diagonal zoning” and “Hypotenuse zoning”, which are suitable for regular fields and run automatically once parameters such as the number of rows, the number of columns, the interval, and the tilt angle are set (Figure 5). “Diagonal zoning” evenly divides the field into rectangular areas following the principle of right-angle diagonals, whereas “Hypotenuse zoning” takes the hypotenuse as the reference line, ultimately dividing the field into identical rectangles. When processing remote sensing imagery containing hundreds of thousands of plots, “Automatic zoning” provides a quick and convenient approach.
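The automatic-zoning idea of tiling a regular field into equally sized, batch-numbered rectangles can be illustrated with a small sketch; all parameter names here are hypothetical and do not mirror CimageA's settings dialog:

```python
def automatic_zoning(x0, y0, plot_w, plot_h, n_rows, n_cols, gap=0.0):
    """Generate AOI rectangles (x, y, width, height) row by row,
    mimicking the idea of automatic zoning over a regular field.

    Illustrative sketch only; tilt-angle handling is omitted.
    """
    plots = []
    for r in range(n_rows):
        for c in range(n_cols):
            x = x0 + c * (plot_w + gap)   # step right by plot width + gap
            y = y0 + r * (plot_h + gap)   # step down by plot height + gap
            plots.append((x, y, plot_w, plot_h))
    return plots
```

Plots come out in reading order (left to right, top to bottom), which gives the batch numbering described in the text for free via the list index.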

2.3.3. Plant Segmentation

The remote sensing imagery of the crop canopy captured by the UAV system captures rich information about the unfolded leaves at the top of the plant. However, the imagery inevitably contains non-crop information such as shadows and bare ground, which affects the eigenvalue extraction of the crop plant. CimageA uses a threshold segmentation method to isolate specific targets of interest and eliminate non-plant parts. The threshold segmentation method is characterized by a low computational cost, simple operation, and good segmentation performance. Its basic principle rests on the grayscale difference between the target object and the background: the grayscale boundary value between the two is used as the threshold, and the pixels in the imagery are then classified one by one against this threshold to separate the target object from the background. In CimageA, users can perform plant segmentation in multiple different channels, such as ExR (excess red), ExG (excess green), R, G, and B, by clicking “Segmentation Method” (Figure 6).
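As an illustration of the threshold-segmentation principle described above, the following sketch segments vegetation using the ExG (excess green) channel with a fixed threshold; the threshold value and function name are assumptions for demonstration, not CimageA's defaults:

```python
import numpy as np

def exg_segmentation(rgb: np.ndarray, threshold: float = 0.1):
    """Segment plant pixels with the excess-green index ExG = 2g - r - b,
    computed on chromatic coordinates (each channel divided by R+G+B).

    Returns a boolean mask where True marks vegetation.
    """
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2) + 1e-9           # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    return exg > threshold                    # threshold classifies each pixel
```

Green foliage scores high on ExG while gray soil and shadow pixels score near zero, so a single threshold separates plant from background, exactly as the grayscale-boundary principle describes.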

2.3.4. Remote Sensing Eigenvalue Extraction

The types of imagery that CimageA can import include high-resolution orthophotos, multi-channel spectral reflectance images, and DSM images. After the AOI is drawn, CimageA automatically extracts 164 remote sensing eigenvalues of the selected area at three levels: image, shape, and spectrum. Click “Data Processing—Save Data” and the data will be exported as a self-named Excel file.
The extracted remote sensing eigenvalues include the canopy texture features of each AOI extracted from the high-definition visible light images; plant canopy coverage, color, vegetation indices (VIs), and other features calculated from the visible light and multispectral images; and upper canopy features extracted from the DSM image. Texture features are generated using gray-level co-occurrence matrices (GLCM). The GLCM reflects comprehensive information about the direction, adjacent intervals, and change amplitudes of image gray levels, capturing the regularities of image pixels and structures across nine categories, including energy, inertia, entropy, and correlation. Quantitative color analysis is an important aspect of crop phenotyping and is used in various common applications (such as maturity analysis [24], chlorophyll content indication, leaf color analysis [25], and variety identification [26]) to provide more valuable information. The color features in the software are extracted from the RGB, HSV, and Lab color spaces. VIs are formed by linearly or nonlinearly combining the reflectance of different bands. Canopy coverage refers to the ratio of the pixel area of the vertically projected vegetation canopy in the observation area to the total pixel area of the plot.
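As a small illustration of how VIs are formed by combining band reflectances, the sketch below computes three widely used indices (NDVI, GNDVI, and NGRDI). This is a generic re-implementation of their standard definitions, not CimageA's code:

```python
import numpy as np

def vegetation_indices(red, green, nir):
    """Compute a few common VIs from per-pixel reflectance arrays.

    NDVI  = (NIR - Red)   / (NIR + Red)
    GNDVI = (NIR - Green) / (NIR + Green)
    NGRDI = (Green - Red) / (Green + Red)
    """
    eps = 1e-9  # guard against division by zero on dark pixels
    return {
        "NDVI": (nir - red) / (nir + red + eps),
        "GNDVI": (nir - green) / (nir + green + eps),
        "NGRDI": (green - red) / (green + red + eps),
    }
```

Because the operations are elementwise, the same function works for a single pixel, a whole AOI, or the full reflectance raster.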

2.3.5. Model Construction and Evaluation

When solar radiation strikes plant leaves, most of the blue and red bands are absorbed by the leaves for photosynthesis, while other bands (such as the green and infrared bands) return to the atmosphere as reflected light, and a small amount is emitted as fluorescence. By capturing the plant reflection spectrum, a relationship model between leaf reflectance imaging characteristic parameters and plant phenotypic traits can be constructed and used to comprehensively analyze the phenotypic traits of crops, including external morphological structure, color, photosynthetic efficiency, and physiological and ecological characteristics.
CimageA embeds a variety of models for crop phenotype inversion, including the general linear model (GLM), robust linear model (RLM), support vector machine (SVM), SVM-linear, SVM-kernel, Gaussian process model (GPM), random forest (RF), decision tree (DT), generalized additive model (GAMS), and neural network model (NNM).
In operation, the measured phenotypic data are imported into the generated table of “remote sensing eigenvalue extraction results” according to the corresponding plot serial number. Attention should be paid to the one-to-one correspondence between the phenotypic data and the remote sensing eigenvalues, after which the table document is saved and an appropriate model is selected for training inversion. Users can set the parameters of the inversion model by themselves, including the proportion of training/verification samples [27].
The coefficient of determination (R2), p-value, root mean square error (RMSE), relative root mean square error (RRMSE), and relative error rate (RE) are used as evaluation indexes of the estimation model. R2, also known as the goodness of fit, measures the proportion of the variation in the dependent variable that can be explained by the variation of the independent variables. RMSE is the square root of the mean squared deviation between the predicted and measured values over the n observations. A larger R2 and a smaller RMSE indicate better predictive ability.
R² = Σᵢ₌₁ⁿ (ŷᵢ − ȳ)² / Σᵢ₌₁ⁿ (yᵢ − ȳ)²
RMSE = √[(1/n) Σᵢ₌₁ⁿ (ŷᵢ − yᵢ)²]
RRMSE = RMSE / ȳ
RE = (ŷᵢ − yᵢ) / yᵢ
where yᵢ is the measured value, ŷᵢ is the predicted value, ȳ is the mean of the measured values, and n is the number of samples.
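The evaluation indexes can be re-implemented in a few lines; the following is a plain sketch of the standard formulas, not CimageA's internal code. Note that the explained-variance form of R2 used here coincides with the usual 1 − SSres/SStot form for least-squares fits with an intercept:

```python
import math

def evaluate(y_true, y_pred):
    """Compute R2, RMSE, and RRMSE for predictions against measurements.

    Illustrative re-implementation of the metrics in Section 2.3.5.
    """
    n = len(y_true)
    y_bar = sum(y_true) / n
    ss_tot = sum((t - y_bar) ** 2 for t in y_true)
    # explained-variance form: sum((y_hat - y_bar)^2) / sum((y - y_bar)^2)
    r2 = sum((p - y_bar) ** 2 for p in y_pred) / ss_tot
    rmse = math.sqrt(sum((p - t) ** 2 for t, p in zip(y_true, y_pred)) / n)
    return {"R2": r2, "RMSE": rmse, "RRMSE": rmse / y_bar}
```

A perfect prediction yields R2 = 1 and RMSE = RRMSE = 0, matching the interpretation that larger R2 and smaller RMSE indicate better predictive ability.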

2.3.6. Large Scale Image Processing

Remote sensing imagery contains rich spectral features and numerous pixels, which leads to a heavy computational load. Image down-sampling is an image processing method that reduces the number of pixels in an image, converting a high-resolution image into a low-resolution one and thereby improving processing speed and efficiency. The software uses Gaussian filtering to down-sample the original image.
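Gaussian down-sampling as described here amounts to low-pass filtering followed by subsampling; below is a minimal sketch, with the `factor` and `sigma` defaults as illustrative assumptions rather than CimageA's settings:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def downsample(image: np.ndarray, factor: int = 2, sigma: float = 1.0):
    """Gaussian low-pass filter, then keep every `factor`-th pixel.

    Smoothing before subsampling suppresses the aliasing that plain
    decimation of a high-resolution image would introduce.
    """
    smoothed = gaussian_filter(image.astype(float), sigma=sigma)
    return smoothed[::factor, ::factor]
```
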

2.3.7. Running Progress Display

Extracting remote sensing eigenvalues or training models often takes a long time. To this end, CimageA provides a progress bar to make the product more user-friendly. The progress bar reflects the progress of the task, indicates the start and completion of an operation, and reduces users' anxiety while waiting.

2.4. Performance Validation of CimageA

2.4.1. Extraction of Ramie Leaf Color Features

Ramie leaf color is a key element in the identification of ramie germplasm resources. In this study, ramie leaf color was classified into levels “1, 2, 3”, corresponding to yellow–green, green, and dark green, according to the rules in the “Specification for the description of ramie resources and data standards” [28]. On 7 May 2023, remote sensing imagery of the ramie canopy in the test area was acquired, and the leaf color classes of the 36 ramie germplasm resources were manually defined according to these rules. CimageA was used to extract the color eigenvalues of the 36 ramie materials from the remote sensing imagery, and we then analyzed the differences in the color eigenvalues of ramie across the leaf color classes. In addition, based on the variety characteristics and the experience of experts, three varieties with typical differences in leaf color were selected for validation testing: Dazhu ramie, Xiangzhu 7, and Changshun ramie. The leaf color of Dazhu ramie is yellow, that of Xiangzhu 7 is green, and that of Changshun ramie is dark green (Figure 7).

2.4.2. Extraction of Ramie Leaf Area Index

Leaf area index (LAI) is an important phenotypic index characterizing the growth, nutritional status, and photosynthetic capacity of ramie [29,30], and it plays a crucial role in evaluating ramie growth and yield. However, existing LAI measurement methods still rely on manual collection, which is time-consuming, costly, and inefficient. Therefore, to provide an efficient and accurate method for obtaining ramie LAI in the field, and to verify the practicality of the remote sensing eigenvalues extracted by CimageA for estimating ramie-related physical and chemical properties, we added a “Model construction and evaluation” module to CimageA to help users solve the remote sensing measurement of crop-related physical and chemical properties in a one-stop manner.
Canopy remote sensing images and the measured LAI of the 36 ramie germplasm resources were collected in the test area over five periods from March to May 2023 (a total of 360 samples). First, CimageA was used to extract the remote sensing eigenvalues of ramie in the different periods; then, Pearson correlation analysis was used to analyze the relationship between the remote sensing eigenvalues and ramie LAI. Finally, a suitable algorithm was selected to construct the ramie LAI estimation model. During modeling, 70% of the samples were randomly selected as the training set, and the remaining 30% were used as the validation set to evaluate the performance of the model.
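The correlation screening and the random 70/30 split described above can be sketched generically as follows; the random seed and function names are assumptions for illustration, not the article's actual configuration:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum()))

def train_val_split(n_samples, train_frac=0.7, seed=0):
    """Randomly split sample indices into training and validation sets,
    mirroring the 70/30 protocol in the text."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(round(train_frac * n_samples))
    return idx[:n_train], idx[n_train:]
```

For the 360 samples in this study, the split yields 252 training and 108 validation samples; eigenvalues whose `pearson_r` against LAI is strong would then be carried into model training.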

2.4.3. Extraction of Ramie Plant Height

The visible light camera carried by the UAV can provide height information for the crop canopy surface. Therefore, crop plant height can be calculated as the difference between the digital surface model (DSM) of the upper canopy and the digital terrain model (DTM) (Figure 8). Based on the DSM images obtained on 13 March, the lowest elevation in each AOI was taken as the surface height of the plot. CimageA was used to extract the average DSM value of each plot, and the measured plant height of each plot was used for the fitting evaluation. When measuring plant height, ten ramie plants were randomly selected in each plot, the distance from the bottom of the plant to the top of the canopy was measured with a ruler, and the average was taken as the plot's plant height.
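The DSM-based height computation reduces to a simple difference; here is a minimal sketch under the assumption, stated in the text, that the lowest DSM value of the early bare-soil date serves as each plot's ground elevation:

```python
import numpy as np

def plot_plant_height(dsm_canopy: np.ndarray, dsm_bare: np.ndarray) -> float:
    """Estimate a plot's plant height from two DSM rasters of the same AOI.

    dsm_bare:   DSM of the plot at an early, mostly bare-soil date
                (13 March in this study), used as the terrain reference.
    dsm_canopy: DSM of the plot at the measurement date.
    """
    ground = float(np.min(dsm_bare))          # lowest elevation = soil surface
    return float(np.mean(dsm_canopy) - ground)  # mean canopy height above ground
```
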

2.4.4. Extraction of Crop Planting Area

Crop area extraction is of great significance for land allocation, agricultural insurance claims, crop yield estimation, and other applications. UAV remote sensing technology enables the accurate extraction of planted or disaster-affected areas, and CimageA offers a fast processing tool for crop area extraction from UAV remote sensing images. When the UAV executes the photography mission, an object of standard length and area is placed in the test area, and its size information is then entered through “Length correction/Area correction” in CimageA to calibrate the ground resolution of the image pixels. To extract the crop planting area, a vector polygon is drawn on the remote sensing imagery; each measured area corresponds to a vector polygon, and the area is automatically calculated from the coverage of the corresponding polygon. In March 2020, a UAV remote sensing system was used to collect remote sensing imagery of the test area, and the sizes of the ramie, jute, and cabbage planting areas, as well as the total planting area, were measured (Figure 9). After area correction by CimageA, the area factor was 1.973.
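The area computation behind the correction step can be sketched as pixel counting scaled by the calibrated ground resolution; the function and its parameters are illustrative assumptions, not CimageA's API:

```python
def polygon_area_m2(n_pixels: int, gsd_m: float, area_factor: float = 1.0) -> float:
    """Ground area covered by a drawn vector polygon.

    n_pixels:    number of image pixels inside the polygon
    gsd_m:       nominal ground sampling distance of one pixel, in meters
    area_factor: calibration factor derived from a reference object of
                 known area (the text reports a factor of 1.973 here)
    """
    return n_pixels * gsd_m ** 2 * area_factor
```

Each pixel contributes a nominal ground footprint of gsd² square meters, and the calibration factor absorbs the residual scale error measured against the reference object.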

3. Results

3.1. Verify the Ramie Color Features Extracted by CimageA

Figure 10 presents the quantitative color features of ramie with different leaf color classes. The results show that the RGB color channels best describe the color information of ramie canopy leaves, and ramie with a yellow–green leaf color (class 1) is easier to distinguish because it overlaps less with green and dark green ramie leaves in the feature values. To further verify the color features extracted by CimageA, three ramie varieties with obvious differences in leaf color (Dazhu ramie, Xiangzhu 7, and Changshun ramie) were selected for quantitative leaf color evaluation. As shown in Figure 11, the red component of Dazhu ramie is higher. According to the additive mixing principle of the three primary colors, yellow is formed by mixing red and green, so the higher the red component, the more yellow is present. Additionally, Dazhu ramie has a higher value in the V channel, indicating that its canopy leaves are brighter. Compared with Dazhu ramie, although Changshun ramie shows green characteristics in the H channel, its color tends toward dark green due to its low brightness and saturation.

3.2. Verify the Phenotype Inversion of CimageA

To verify the ability of CimageA to estimate crop phenotypic traits, we took ramie LAI as the inversion target and constructed a general model applicable to different growth periods based on the 360 samples obtained from the five periods. Remote sensing eigenvalues of the target AOIs were extracted using CimageA, and the correlation between the remote sensing eigenvalues and LAI was analyzed (Figure 12). The results of the Pearson correlation analysis indicate that many remote sensing eigenvalues were highly significantly correlated with LAI (p < 0.01) or significantly correlated with LAI (p < 0.05). In addition, we used the measured LAI and the corresponding remote sensing eigenvalues as model inputs and constructed the ramie LAI estimation model with ten algorithms: GLM, RLM, SVM, SVM-linear, SVM-kernel, GPM, RF, DT, GAMS, and NNM. During modeling, the samples were randomly divided into training and validation sets at a ratio of 7:3. As can be seen from Table 1, the R2 of the training set models ranges from 0.637 to 1, with an RMSE varying from 0 to 1.518, and the R2 of the validation set models ranges from 0.003 to 0.796, with an RMSE ranging from 0.684 to 1.512. Among the ten models, the ramie LAI estimation model constructed with NNM has high accuracy and stable performance, indicating that CimageA can be used to estimate crop-related phenotypic traits.

3.3. Verify the Ramie Plant Height Extracted by CimageA

Plant height is an important index for crop growth monitoring. CimageA was used to extract ramie plant height at the plot scale, and the estimated plant height derived from UAV imagery was compared with the measured plant height. As shown in Figure 13, there was an obvious linear correlation between the estimated and measured plant heights (R2 = 0.989), with an acceptable deviation between the two (RMSE = 10.654 cm). The constructed time-series dataset of estimated plant height reveals the temporal pattern of ramie plant height (Figure 14). Ramie plant height increased slowly in the early and late growth stages, while it increased rapidly during the closure and prosperous growth stages, which were also the key periods in which differences in the plant height of ramie germplasm resources formed. Plants with better growth receive more light, heat, and air, allowing them to fully exploit their growth advantages after the closure stage. Therefore, ramie varieties with growth potential can be screened by comparing changes in plant height during this period.

3.4. Verify the Crop Area Extracted by CimageA

To evaluate the performance of CimageA in extracting crop areas, three crops with different planting areas were selected for validation (Figure 15). As shown in the figure, CimageA can accurately extract the planting areas of different crops. The measured planting area of ramie is 176.170 m2, and the area identified by CimageA is 178.871 m2, an error of 2.702 m2. The measured planting area of jute is 18.250 m2, and the area identified by CimageA is 17.947 m2, an error of 0.303 m2. The measured planting area of cabbage is 8.760 m2, and CimageA identifies 8.758 m2, an error of 0.001 m2.

4. Discussion

In agriculture, large-scale production activities and breeding programs require the phenotypic analysis of crop germplasm resources to support better management and decision-making and to aid the selection of high-quality varieties. Currently, phenotyping is a bottleneck in accelerating the crop breeding process and in breeding high-quality, high-yield crop varieties [31]. CimageA was developed as an effective tool for UAV-based crop phenotype data acquisition and analysis. It mainly solves the problems of the complex extraction steps for remote sensing eigenvalues and the difficulty of constructing inversion models and visualization analyses in remote sensing monitoring. With the help of this software, agricultural workers can extract large numbers of plot-level phenotype data from multi-source UAV images; the visualized data not only help to analyze the specific genes of different crop varieties, but also help to observe the phenotypic changes of crops in different environments and to guide production activities.
In terms of remote sensing eigenvalue extraction, due to the complexity of calculation steps, previous studies usually calculated a few remote sensing eigenvalues for model training, and the common eigenvalues are NDVI (normalized difference vegetation index), GNDVI (Green-NDVI), NGRDI (normalized green-red difference index) [32], etc., as these VIs are closely correlated with phenotypic traits such as crop greenness and biomass. Texture features have also been widely used in recent years due to their ability to weaken the influence of spectral saturation and improve the accuracy of crop phenotypic trait inversion [33]. Although the commonly used indicators mentioned above are of great value in the estimation of certain crop traits, the most sensitive features reflecting crop growth often differ due to the influence of various factors such as crop varieties, growing environments, and phenological periods. For example, as early as 1992, some scholars proposed that crop LAI was linearly correlated with vegetation indices RVI (ratio vegetation index) and TSAVI (transformed soil adjustive vegetation index), while it is more suitable to be fitted with GVI (Greenness vegetation index) and PVI (perpendicular vegetation index) using quadratic equation fitting. Therefore, one of the objectives of developing CimageA is to realize the one-button extraction of remote sensing eigenvalues and to extract richer and more comprehensive remote sensing eigenvalues as much as possible, because there is still potentially critical information in these eigenvalues that is significantly correlated with crop phenotypes. With CimageA, users can achieve feature extraction at three levels: graphical, morphological, and spectral. The number of extracted remote sensing eigenvalues reaches 164. 
These color, texture, and spectral features can not only be used for simple identification or classification processes, but also have important significance for the quantitative estimation of phenotypic traits.
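As an illustration of how such spectral eigenvalues are derived, the sketch below computes the three vegetation indices named above (NDVI, GNDVI, NGRDI) from per-pixel band reflectances. This is a minimal, hypothetical example using NumPy; it is not CimageA's implementation, and the band arrays and epsilon guard are assumptions for the sake of a self-contained demonstration.

```python
import numpy as np

def vegetation_indices(red, green, nir):
    """Compute three common vegetation indices from per-pixel band
    reflectances (arrays of values in [0, 1])."""
    eps = 1e-9  # guard against division by zero on dark pixels
    ndvi = (nir - red) / (nir + red + eps)        # normalized difference vegetation index
    gndvi = (nir - green) / (nir + green + eps)   # green NDVI
    ngrdi = (green - red) / (green + red + eps)   # normalized green-red difference index
    return ndvi, gndvi, ngrdi

# Toy 2x2 "plot": reflectances for the red, green, and near-infrared channels
red = np.array([[0.10, 0.12], [0.08, 0.11]])
green = np.array([[0.15, 0.14], [0.16, 0.13]])
nir = np.array([[0.45, 0.40], [0.50, 0.42]])

ndvi, gndvi, ngrdi = vegetation_indices(red, green, nir)
```

Averaging each index over the pixels of an AOI would then yield one plot-level eigenvalue per index, which is the form of data used for the inversion models discussed below.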
To further promote the practical application of the remote sensing eigenvalues extracted by CimageA, we developed a module for crop phenotype inversion modeling based on traditional regression algorithms and machine learning algorithms, and verified its practicality by estimating ramie LAI. The results show a significant correlation between CimageA-derived remote sensing eigenvalues and ramie LAI. The ramie LAI estimation model built with a neural network algorithm could estimate the LAI of different ramie genotypes, with an R2 of 0.796 and an RMSE of 0.684. During model construction, all input data were CimageA-derived eigenvalues; the final accuracy of the model is determined by factors such as the correlation between the target phenotypic traits and the remote sensing eigenvalues, the modeling strategy, and the number of samples, rather than by the CimageA software itself. To ensure the accuracy and stability of the constructed model, users can supply more samples and choose a more appropriate model.
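The inversion-modeling workflow described above — eigenvalues in, a trained regressor and accuracy metrics out — can be sketched as follows. This is an illustrative example on synthetic data using scikit-learn's neural network regressor, not the actual CimageA module; the feature dimensions, network size, and the linear relationship generating the fake LAI values are all assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Stand-in for plot-level remote sensing eigenvalues (200 plots, 5 features)
X = rng.uniform(0.0, 1.0, size=(200, 5))
# Synthetic LAI driven by the first two features plus measurement noise
lai = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0.0, 0.1, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, lai, test_size=0.3, random_state=0)

# Small feed-forward network, analogous in spirit to the NNM model in Table 1
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
r2 = r2_score(y_te, pred)                        # coefficient of determination
rmse = mean_squared_error(y_te, pred) ** 0.5     # root-mean-square error
```

Holding out a validation set, as in the table of model accuracies below, is what reveals overfitting: models that fit the training set perfectly (e.g., unconstrained tree ensembles) can still validate poorly.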
In addition, CimageA addresses the need to measure crop acreage in production management. Thanks to its area correction function, CimageA achieved high accuracy in measuring the planting areas of ramie, jute, and cabbage. Although we only verified crop acreage extraction at a relatively small scale in this study, we believe this function has greater value in large-scale area surveys, agricultural insurance claims, and crop yield estimation.
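The basic arithmetic behind acreage measurement from a segmented UAV image is simple: count the crop pixels and multiply by the ground area each pixel covers. The sketch below illustrates this under stated assumptions — the binary mask, the ground sampling distance, and the `correction` factor (standing in for CimageA's area correction, e.g. calibrated against a plot of known ground-truth area) are all hypothetical.

```python
import numpy as np

def planting_area(mask, gsd_m, correction=1.0):
    """Estimate planting area in square metres from a binary crop mask.

    mask       : 2-D boolean array, True where crop pixels were segmented
    gsd_m      : ground sampling distance, metres per pixel
    correction : hypothetical multiplicative correction factor,
                 calibrated against a reference area (default: none)
    """
    n_pixels = int(mask.sum())          # number of segmented crop pixels
    return n_pixels * gsd_m ** 2 * correction

# Toy example: 100 x 100 = 10,000 crop pixels at 2 cm/pixel -> 4 m^2
mask = np.ones((100, 100), dtype=bool)
area = planting_area(mask, gsd_m=0.02)
```

In practice the mask would come from a vegetation segmentation step (such as the ExR-based segmentation shown in Figure 6), and the correction factor would absorb residual geometric distortion not removed by orthorectification.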
In practical breeding programs, it is valuable to display population phenotypic distributions using visualization technology. Previously, breeding a new variety could take 10 years or more, because thousands of germplasm resources had to be screened for phenotypic superiority, and frequent surveys were required after each generation of hybridization to record changes in field growth phenotypes. By visualizing the results of phenotypic analysis, breeders can quickly identify specific varieties during screening and monitor the differences and diversity of population phenotypes.
CimageA includes three functions: image preprocessing, remote sensing eigenvalue extraction, and data analysis. However, it cannot produce orthophotos. In UAV-based crop phenotype monitoring, orthophoto generation is an essential step after image acquisition: a UAV remote sensing task yields a large series of field images that must be stitched by complex algorithms into orthophotos reflecting the global information of the field. At present, orthophoto generation relies mainly on specialized software such as Pix4Dmapper and DJI Terra, so the lack of this capability is one limitation of CimageA. In addition, CimageA currently supports only multispectral image processing; our team will continue developing the software to handle data from hyperspectral, radar, thermal infrared, and other sensors.

5. Conclusions

CimageA provides strong support for crop remote sensing data extraction and crop phenotype analysis based on UAV images, and can efficiently quantify crop biological traits. In this study, 164 plot-level remote sensing eigenvalues generated by CimageA were used to quantitatively grade the color of ramie canopy leaves, and the result can assist in the identification of ramie varieties, phenological periods, and growth status. When we combined these remote sensing eigenvalues with machine learning algorithms to construct a ramie LAI estimation model, CimageA effectively enabled the nondestructive evaluation of crop phenotypes. CimageA is also of great value in measuring plant height and planting area. We therefore believe this work represents valuable progress in using UAVs for crop phenotypic analysis, supporting the selection of high-quality crop varieties and the acceleration of crop breeding.

Author Contributions

J.L. (Jinwei Li) and H.F. conceived and designed this study. J.L. (Jianning Lu), J.N. and W.W. performed the implementation and analyzed the results. J.L. (Jinwei Li), G.C. and W.S. were major contributors to writing the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Ministry of Finance and the Ministry of Agriculture and Rural Affairs: National Modern Agricultural Industry Technology System (CARS-16-E11); the National Key Research and Development Program (2018YFD0201106); and the National Natural Science Foundation of China (31471543).

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.

Acknowledgments

The authors thank Hunan Agricultural University (Hunan, China) for technical support and the material used for the experiment.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Tester, M.; Langridge, P. Breeding technologies to increase crop production in a changing world. Science 2010, 327, 818–822. [Google Scholar] [CrossRef] [PubMed]
  2. Furbank, R.T.; Tester, M. Phenomics technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644. [Google Scholar] [CrossRef] [PubMed]
  3. Wang, X.; Qiu, L.; Jing, R.; Ren, G.; Li, Y.; Li, C.; Qin, P.; Gu, Y.; Li, L. Evaluation on phenotypic traits of crop germplasm: Status and development. J. Plant Genet. Resour. 2022, 23, 12–20. [Google Scholar]
  4. Persa, R.; Ribeiro, P.; Jarquin, D. The use of high-throughput phenotyping in genomic selection context. Crop Breed. Appl. Biotechnol. 2021, 21, e385921S6. [Google Scholar] [CrossRef]
  5. Li, D.; Quan, C.; Song, Z.; Li, X.; Yu, G.; Li, C.; Muhammad, A. High-throughput plant phenotyping platform (HT3P) as a novel tool for estimating agronomic traits from the lab to the field. Front. Bioeng. Biotechnol. 2021, 8, 623705. [Google Scholar] [CrossRef]
  6. Liu, J.; Zhao, C.; Yang, G.; Yu, H.; Zhao, X.; Xu, B.; Niu, Q. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform. Trans. Chin. Soc. Agric. Eng. 2016, 32, 98–106. [Google Scholar]
  7. Cabrera, B.L.; Crossa, J.; Von, Z.J.; Serret, M.D.; Araus, J.L. High-throughput phenotyping and genomic selection: The frontiers of crop breeding converge. J. Integr. Plant Biol. 2012, 54, 312–320. [Google Scholar] [CrossRef]
  8. Santana, D.C.; De Oliveira Cunha, M.P.; Dos Santos, R.G.; Cotrim, M.F.; Teodoro, L.P.R.; Da Silva Junior, C.A.; Baio, F.H.R.; Teodoro, P.E. High-throughput phenotyping allows the selection of soybean genotypes for earliness and high grain yield. Plant Methods 2022, 18, 13. [Google Scholar] [CrossRef]
  9. Kim, J.Y. Roadmap to high throughput phenotyping for plant breeding. J. Biosyst. Eng. 2020, 45, 43–55. [Google Scholar] [CrossRef]
  10. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef]
  11. Jangra, S.; Chaudhary, V.; Yadav, R.C.; Yadav, N.R. High-throughput phenotyping: A platform to accelerate crop improvement. Phenomics 2021, 1, 31–53. [Google Scholar] [CrossRef] [PubMed]
  12. Hu, W.; Ling, H.; Fu, X. Development and application of the plant phenomics analysis platform. Genetics 2019, 41, 1060–1066. [Google Scholar]
  13. Fahlgren, N.; Feldman, M.; Gehan, M.A.; Wilson, M.S.; Shyu, C.; Bryant, D.W.; Hill, S.T.; McEntee, C.J.; Warnasooriya, S.N.; Kumar, I.; et al. A versatile phenotyping system and analytics Platform reveals diverse temporal responses to water availability in setaria. Mol. Plant 2015, 8, 1520–1535. [Google Scholar] [CrossRef] [PubMed]
  14. Van Tassel, D.L.; DeHaan, L.R.; Diaz-Garcia, L.; Hershberger, J.; Rubin, M.J.; Schlautman, B.; Turner, K.; Miller, A.J. Re-imagining crop domestication in the era of high throughput phenomics. Curr. Opin. Plant Biol. 2022, 65, 102150. [Google Scholar] [CrossRef] [PubMed]
  15. Yang, W.; Doonan, J.H.; Hawkesford, M.J.; Pridmore, T.; Zhou, J. Editorial: State-of-the-art technology and applications in crop phenomics. Front. Plant Sci. 2021, 12, 2226. [Google Scholar] [CrossRef]
  16. Sun, G.; Lu, H.; Zhao, Y.; Zhou, J.; Jackson, R.; Wang, Y.; Xu, L.X.; Wang, A.; Colmer, J.; Ober, E.; et al. AirMeasurer: Open-source software to quantify static and dynamic traits derived from multiseason aerial phenotyping to empower genetic mapping studies in rice. New Phytol. 2022, 236, 1584–1604. [Google Scholar] [CrossRef]
  17. Yang, W.; Feng, H.; Zhang, X.; Zhang, J.; Doonan, J.H.; Batchelor, W.D.; Xiong, L.; Yan, J. Crop phenomics and high-throughput phenotyping: Past decades, current challenges, and future perspectives. Mol. Plant 2020, 13, 187–214. [Google Scholar] [CrossRef]
  18. Guo, Y.; Chen, S.; Li, X.; Cunha, M.; Jayavelu, S.; Cammarano, D.; Fu, Y. Machine learning-based approaches for predicting spad values of maize using multi-spectral images. Remote Sens. 2022, 14, 1337. [Google Scholar] [CrossRef]
  19. Pan, F.J.; Guo, J.K.; Miao, J.C.; Xu, H.Y.; Tian, B.Q.; Gong, D.C.; Zhao, J.; Lan, Y.B. Summer maize LAI retrieval based on multi-source remote sensing data. Int. J. Agric. Biol. Eng. 2023, 16, 179–186. [Google Scholar] [CrossRef]
  20. Riccardi, M.; Mele, G.; Pulvento, C.; Lavini, A.; d’Andria, R.; Jacobsen, S.E. Non-destructive evaluation of chlorophyll content in quinoa and amaranth leaves by simple and multiple regression analysis of RGB image components. Photosynth. Res. 2014, 120, 263–272. [Google Scholar] [CrossRef]
  21. Wright, H.C.; Lawrence, F.A.; Ryan, A.J.; Cameron, D.D. Free and open-source software for object detection, size, and colour determination for use in plant phenotyping. Plant Methods 2023, 19, 126. [Google Scholar] [CrossRef] [PubMed]
  22. Wang, B.; Yang, C.; Zhang, J.; You, Y.; Wang, H.; Yang, W. IHUP: An integrated high-throughput universal phenotyping software platform to accelerate unmanned-aerial-vehicle-based field plant phenotypic data extraction and analysis. Plant Phenomics 2024, 6, 0164. [Google Scholar] [CrossRef] [PubMed]
  23. Tu, K.; Wu, W.; Cheng, Y.; Zhang, H.; Xu, Y.; Dong, X.; Wang, M.; Sun, Q. AIseed: An automated image analysis software for high-throughput phenotyping and quality non-destructive testing of individual plant seeds. Comput. Electron. Agric. 2023, 27, 107740. [Google Scholar] [CrossRef]
  24. Mhaski, R.R.; Chopade, P.B.; Dale, M.P. Determination of ripeness and grading of tomato using image analysis on Raspberry Pi. In Proceedings of the 2015 Communication, Control and Intelligent Systems (CCIS), Mathura, India, 7–8 November 2015; Volume 2016, pp. 214–220. [Google Scholar]
  25. Wu, X.M.; Zhang, F.G.; Lu, J.T. Research on recognition of tea tender leaf based on image color information. J. Tea Sci. 2013, 33, 584–589. [Google Scholar]
  26. Cui, D.D.; Cui, G.X.; Yang, R.F.; She, W.; Liu, Y.; Wang, H.; Su, X.; Wang, J.; Liu, W.; Wang, X.; et al. Phenotypic characteristics of ramie (Boehmeria nivea L.) germplasm resources based on UAV remote sensing. Genet. Resour. Crop Evol. 2021, 68, 551–566. [Google Scholar] [CrossRef]
  27. Walter, J.; Edwards, J.; Cai, J.; McDonald, G.; Miklavcic, S.J.; Kuchel, H. High-Throughput field imaging and basic image analysis in a wheat breeding programme data. Front. Plant Sci. 2019, 10, 449. [Google Scholar] [CrossRef]
  28. Jie, Y.C.; Xu, Y.; Sun, Z.M.; Chen, J.F.; Xing, H.C.; She, W.; Cai, S.W.; Wang, X.F.; Qin, Z.J.; Luo, Z.Q. Description and Data Standard of Ramie Germplasm Resources; China Agriculture Press: Beijing, China, 2007; pp. 1–3. [Google Scholar]
  29. Fu, H.Y.; Lu, J.N.; Chen, J.F.; Wang, W.; Cui, G.; She, W. Influence of structure and texture feature on retrieval of ramie leaf area index. Agronomy 2023, 13, 1690. [Google Scholar] [CrossRef]
  30. Zhang, H.; Liu, W.; Han, W.; Liu, Q.; Song, R.; Hou, G. Inversion of summer maize leaf area index based on gradient boosting decision tree algorithm. Trans. Chin. Soc. Agric. Mach. 2019, 50, 258–266. [Google Scholar]
  31. Wanga, M.A.; Shimelis, H.; Mashilo, J.; Laing, M.D. Opportunities and challenges of speed breeding: A review. Plant Breed. 2021, 140, 185–194. [Google Scholar] [CrossRef]
  32. Wan, L.; Du, X.; Chen, S.; Yu, F.; Zhu, J.; Xu, T.; He, Y.; Cen, H. Rice panicle phenotyping using UAV-based multi-source spectral image data fusion. Trans. Chin. Soc. Agric. Eng. 2022, 38, 162–170. [Google Scholar]
  33. Sun, B.; Wang, C.F.; Yang, C.H.; Xu, B.; Zhou, G.; Li, X.; Xie, J.; Xu, S.; Liu, B.; Xie, T.; et al. Retrieval of rapeseed leaf area index using the PROSAIL model with canopy coverage derived from UAV images as a correction parameter. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102373. [Google Scholar]
Figure 1. Plots’ distribution and multi-channel remote sensing images of ramie test area. Result_RGB is the highest-resolution visible image of the test area, Result_Blue shows the blue channel spectral reflectance image, Result_Green shows the green channel spectral reflectance image, Result_Red shows the red channel spectral reflectance image, Result_RedEdge shows the red edge channel spectral reflectance image and Result_Nir is the spectral reflectance image of the near infrared channel.
Figure 2. User interface and instructions of CimageA. Numbers in the figure provide guidance: (1) File processing. (2) Image processing and parameter settings. (3) Data processing. (4) Data analysis. (5) Phenotypic visualization.
Figure 3. The operation process of CimageA.
Figure 4. Tilt correction of CimageA. (A) Image before tilt correction. (B) Image after tilt correction.
Figure 5. Intelligent drawing of the AOI. (A) Plots divided by diagonal zoning. (B) Plots divided by hypotenuse zoning.
Figure 6. The result of ramie plant segmentation based on ExR. (A) Original image. (B) Grayscale image in ExR channel. (C) The segmented image.
Figure 7. Three ramie varieties with different leaf color. (A) Dazhu ramie. (B) Xiangzhu 7. (C) Changshun ramie.
Figure 8. Extraction of ramie plant height based on UAV remote sensing images.
Figure 9. Crop planting area measurement. Area 1 (red area) is the ramie planting area, area 2 (green area) is the jute planting area, area 3 (blue area) is the cabbage planting area, and area 4 includes all planting areas.
Figure 10. Quantitative color features of the ramie with different leaf color classes.
Figure 11. Quantitative leaf color evaluation of three ramie varieties.
Figure 12. Correlation analysis between ramie LAI and remote sensing eigenvalues.
Figure 13. Relationship between the measured plant height and the estimated plant height.
Figure 14. Temporal changes of the estimated plant height.
Figure 15. Precision of crop area extracted by CimageA. ** indicates a significant correlation.
Table 1. Estimation accuracy of ramie LAI estimation model.
Model        Dataset         RMSE    RRMSE   RE      RESTD
GLM          Training set    0.382   7.598   6.340   5.692
             Validation set  0.924   17.678  14.654  15.972
RLM          Training set    0.401   7.914   6.562   6.440
             Validation set  0.841   16.344  13.979  13.472
SVM          Training set    0.670   13.359  10.955  8.774
             Validation set  0.786   14.931  13.526  12.450
SVM-linear   Training set    1.518   29.754  24.577  12.768
             Validation set  1.484   29.306  22.145  13.245
SVM-kernel   Training set    0.754   14.785  12.705  18.913
             Validation set  1.512   29.835  30.582  30.170
GPM          Training set    0.579   11.236  9.752   7.628
             Validation set  0.903   18.241  15.219  14.150
RF           Training set    0.000   0.000   0.000   0.000
             Validation set  1.007   19.269  16.455  15.856
DT           Training set    0.000   0.000   0.000   0.000
             Validation set  0.977   18.514  15.396  14.828
GAMS         Training set    0.000   0.000   0.000   0.000
             Validation set  1.196   24.201  21.448  18.793
NNM          Training set    0.670   13.049  11.029  9.410
             Validation set  0.684   13.722  12.458  11.499
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
