Article

Calculation Method for Phenotypic Traits Based on the 3D Reconstruction of Maize Canopies

1 College of Electrical and Information, Heilongjiang Bayi Agricultural University, Daqing 163319, China
2 Agronomy College of Heilongjiang Bayi Agricultural University, Daqing 163319, China
3 Key Laboratory of Modern Precision Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China
* Authors to whom correspondence should be addressed.
Sensors 2019, 19(5), 1201; https://doi.org/10.3390/s19051201
Submission received: 2 February 2019 / Revised: 4 March 2019 / Accepted: 5 March 2019 / Published: 8 March 2019
(This article belongs to the Special Issue Advanced Sensor Technologies for Crop Phenotyping Application)

Abstract

A reasonable plant type is an essential factor for improving canopy structure, ensuring a reasonable expansion of the leaf area index and obtaining a high-quality spatial distribution of light. It is of great significance in promoting effective selection of the ecological breeding index and production practices for maize. In this study, a method for calculating the phenotypic traits of the maize canopy in three-dimensional (3D) space was proposed, focusing on the problems of traditional measurement methods in maize morphological structure research, such as their complex procedures and relatively large error margins. Specifically, the whole maize plant was first scanned with a FastSCAN hand-held scanner to obtain 3D point cloud data for maize. Subsequently, the raw point clouds were simplified by the grid method, and noise in the maize canopy point clouds was further removed by bilateral filtering. In the last step, the 3D structure of the maize canopy was reconstructed. In accordance with the 3D reconstruction of the maize canopy, the phenotypic traits of the maize canopy, such as plant height, stem diameter and canopy breadth, were calculated by means of a fitting sphere and a fitting cylinder. Thereafter, multiple regression analysis was carried out on the calculated data and the actual measured data to verify the accuracy of the calculation method proposed in this study. The corresponding results showed that the calculated values of plant height, stem diameter and canopy breadth based on 3D scanning were highly correlated with the actual measured data, with coefficients of determination R² of 0.9807, 0.8907 and 0.9562, respectively. In summary, the method proposed in this study can accurately measure the phenotypic traits of maize. Significantly, these research findings provide technical support for further research on the phenotypic traits of other crops and on variety breeding.

1. Introduction

The selection and breeding of excellent maize varieties has attracted extensive international attention [1]. Many influential research centers, such as the International Maize and Wheat Improvement Center (CIMMYT) [2] and the International Center for Agricultural Research in the Dry Areas (ICARDA) [3] have been working on genetic diversity and cultivation of maize plants. Maize is also one of the most important food crops in China, as well as one of the most important industrial raw materials and economic crops. Deep processing of maize can be widely applied in food, medicine, bioenergy and more than 20 other industries. The development of the Chinese maize industry is inseparable from maize breeding. However, with the increases in drought, diseases and insect pests, the resistance of new maize varieties must be improved further [4]. Thus, maize breeding research faces a series of challenges [5].
The crop phenotype refers to the physical, physiological and biochemical characteristics and properties of crops that are determined or influenced by genetic and environmental factors during their growth and development [6,7,8]. Accurate and rapid acquisition of plant phenotypic information will provide theoretical and technical support for promoting the development of crop science, thus ensuring world food security, ecological security and sustainable agricultural development.
Plant height is a vital phenotypic trait for characterizing plant growth and is one of the most basic indicators in plant morphological investigations. Plant height is defined as the distance from the base of the plant to the upper boundary of the main photosynthetic tissues (excluding inflorescences) [9,10]. The maintenance of plant height at a relatively uniform horizontal height in the field indicates a well-distributed nutrient supply of fertilizer in the field, which is conducive to processes such as photosynthesis and pollination. Stem diameter is another important trait reflecting plant phenotype [11,12]. In addition to its roles in water and nutrient transport, the stem can store nutrients and can transport nutrients to grains in later stages. Furthermore, the stem is the organ that produces and supports spikes, exhibiting phototropism and negative geotropism [13]. When the plant lodges, it can bend upward to maintain growth, so that the plant can become erect again to reduce losses. The thickness of the stem is positively correlated with lodging resistance [10], as well as the ability to store and transport water and fertilizer, resulting in a greater amount of nutrients being transported to the grain. Additionally, canopy breadth is the average width of the plant canopy, which not only determines the distribution of light in the plant canopy but also serves as an index for evaluating the efficiency and effectiveness of agricultural management in relation to fertilization, irrigation, thinning and harvesting [14,15]. Therefore, monitoring the changes in plant height, stem diameter and canopy breadth in different periods can enable agronomists and breeders to keep abreast of plant health and growth in a timely manner.
With the development of agricultural mechanization, as well as modern farmland cultivation and management technology, significant improvements in agricultural production technology have been achieved [16]. Revolutionary changes in maize science research have been obtained during the development of modern biotechnology [17]. However, traditional methods for the acquisition of phenotypic traits, such as plant height, stem diameter and canopy width, are still carried out through manual measurement with a ruler or measuring tape, which is labor-intensive, inefficient and inaccurate and limits the development of modern maize science [18,19]. In view of the current status, it is urgent to develop non-destructive and accurate means of detection of phenotypic traits to reduce labor, improve efficiency and promote the rapid development of maize science [20,21]. In this context, informatization has been widely applied to the study of maize phenotypic traits [22].
One important means of studying the calculation methods for maize canopy phenotypic parameters is based on reconstruction of the three-dimensional (3D) structure of the maize canopy [23]. At present, the methods for 3D maize canopy acquisition mainly include the use of 3D digitizers, hand-held sensors, binocular stereo vision technology, laser scanning technology and unmanned aerial vehicle (UAV) remote-sensing technology [24]. It is time-consuming to manually measure a large amount of geometric information from crop canopies when reconstructing 3D plant morphology with a 3D digitizer [25]. Hand-held sensors use a contact method to measure geometric parameters, which leads to deformation of the plant canopy [26]. Furthermore, visible light-based binocular stereo vision technology possesses advantages in the non-contact and non-destructive acquisition of crop canopy images without destroying crop growth patterns, resulting in significant improvement of the efficiency of data acquisition [27], which is helpful for monitoring crop emergence rates, flowering dynamics, canopy coverage and lodging during crop growth [28]. However, the quality of images acquired by this technique is greatly affected by light conditions; hence, the accuracy of the calculation of phenotypic parameters needs to be further improved. Laser [29] or light detection and ranging (LIDAR) [15] sensors are active remote-sensing devices using laser as the emitting light source and photoelectric detection. These devices present the advantages of a high resolution, strong anti-jamming ability and good detection performance at low altitudes, and can quickly obtain high-precision horizontal and vertical structure parameters of plant canopies. However, due to their high cost and the massive data processing involved, these devices are generally only applied in the analysis of tree biomass [30] and are rarely used in crop phenotypic research. 
In addition, the analysis of crop phenotypic traits based on UAV multi-sensor platforms presents obvious superiority, due to factors such as a low cost, high efficiency, and high-resolution data acquisition, as well as synchronous acquisition of multi-source images [31]. This strategy has been widely used in analyzing parameters such as plant height, chlorophyll content, and nitrogen content [32]. However, it is difficult to measure other important phenotypic traits of crops, such as the leaf inclination and stem diameter of maize, with UAV remote-sensing technology. High-throughput phenotyping platforms in greenhouses with controlled growth conditions are alternative approaches for automatically measuring geometric dimensions of plants [33,34]. Hyperspectral imaging technology [35], two-dimensional digital cameras [36] and the Structure from Motion (SfM) method [37] contribute significantly to automated phenotyping in greenhouses in terms of morphological characteristics [38] and stress resistance [39]. Although these platforms can be operated automatically with good repeatability, they also show limitations. The main shortcoming is that certain traits calculated from an individual plant cannot reflect the true situation in the natural environment, due to the artificial laboratory conditions.
Although these excellent sensors have been widely used in the high-throughput acquisition of plant phenotypic parameters, there are still some limitations in obtaining specific phenotypic traits. In addition, advanced improvement is warranted concerning the accuracy of plant 3D reconstruction and the calculation accuracy of phenotypic traits.
Hand-held laser scanning, which shows benefits in the high-precision acquisition of 3D point clouds, plays a crucial role in plant 3D reconstruction and the calculation of plant phenotypic traits, especially in acquiring more specific phenotypic traits of crops [40] (e.g., stem diameter, leaf inclination angle, blade angle). The hand-held scanner can digitize the surface of any object rapidly and conveniently and displays the morphological structure of the target object in 3D space, providing an effective data source for further calculation of phenotypic traits.
In this context, to compensate for the shortcomings of the research results above and improve the calculation accuracy for phenotypic traits, a 3D structural model of maize plants was established through FastSCAN hand-held laser scanning (Cobra™, Aranz Scanning Ltd., Christchurch, New Zealand). On this basis, the calculation method for maize phenotypic traits was subsequently studied. The main aims of this study were as follows: (1) to simplify raw point clouds using the grid method; (2) to progressively remove the effect of noise on point cloud quality in the maize canopy with a bilateral filtering algorithm; and (3) to calculate phenotypic traits of the maize canopy, including plant height, stem diameter and canopy breadth, by fitting spheres and cylinders based on 3D reconstruction of the maize canopy. The results of this research may provide technical support for further exploration of the phenotypic traits of other crops and the breeding of crop varieties.

2. Materials and Methods

2.1. Experimental Treatments and Measurement of Phenotypic Traits

From May to September 2018, maize planting and data acquisition were carried out in the Innovation and Entrepreneurship Training Park for Excellent Agricultural Talents of the Agronomy College of Heilongjiang Bayi Agricultural University (46°62′ N, 125°20′ E), which is located in the north temperate zone, with a continental monsoon climate. The average temperature is 4.2 °C, the annual average frost-free period is 143 days, annual rainfall is 427.5 mm, and annual evaporation is 1635 mm. The effective accumulated temperature is 2600 °C, and there are 1147.8 sunshine hours.
The pot cultivation method was adopted in the experiment. Each pot (polyvinyl chloride, 32 cm in diameter and 26 cm in height) was filled with 10 kg of dried soil. Naturally dried, plump maize seeds of consistent size were selected. The two varieties of maize seeds were soaked and disinfected with 1% NaClO for 5 min and then rinsed with distilled water 3 times. Finally, the sterilized seeds were placed in a germination box lined with two layers of wet filter paper and then placed in a 25 °C constant temperature box for one day. The soil tested was meadow soil, and maize was planted at 10-day intervals in three batches. Each variety was repeated five times and placed randomly. Each pot was sown evenly, and the seedlings were transplanted individually at the trefoil stage after emergence. The plants were watered to 70 percent of the soil moisture content each time; other aspects were in accordance with regular management practices. The study of phenotypic traits from the trefoil stage to the jointing stage is particularly important among the total growth stages of maize. Accordingly, phenotypic traits including plant height, stem diameter and canopy breadth were measured with rulers and measuring tapes from the trefoil stage to the jointing stage, to verify the validity of the method for calculating phenotypic traits.

2.2. Data Acquisition Device

To reconstruct the precise 3D structure of the maize canopy and improve the accuracy of point cloud data in the maize canopy, emphasis should be placed on the improvement of data acquisition technology for the maize canopy. In this study, a FastSCAN Cobra™ hand-held 3D laser scanner was used to acquire 3D point cloud data for the maize canopy. The FastSCAN device can scan non-metallic and opaque objects. Based on its operational principle, the scanner records data on a contour section of an object's surface by projecting a laser beam. Embedded motion-tracking technology is applied to detect the position and orientation of the wand and to jointly reconstruct the object in three dimensions. During the process, the resolution of the device is 0.178 mm in the range of 200 mm from the scanned object, and the scanning speed is 50 lines per second. The distance between lines depends on the moving speed of the laser head, and the resolution is 1 mm at a moving speed of 50 mm per second. The speed at which the wand is moved over the surface of the object is the major determinant of the resolution of the sweeps. The number of raw 3D cloud points increased with maize growth, from over 10,000 points at the three-leaf stage to over 40,000 points at the eight-leaf stage.
Figure 1 is a schematic diagram of the data acquisition process. The scanning effect can be viewed in real time with the FastSCAN software installed on a laptop. To ensure both a high resolution and high accuracy, the wand should be held close to the maize plant's surface during scanning, but no closer than 80 mm to remain within the camera's field of view. Parts of the surface closer to the receiver will be scanned more accurately. The transmitter should be kept close to the wand and receiver, as accuracy deteriorates with distance, but no closer than 100 mm to avoid signal overload. The maximum separation of any two components is approximately 750 mm, but they should be kept as close as possible for best results.

2.3. Overall Process Flow for Calculating Phenotypic Traits

In this study, a method based on the use of a FastSCAN hand-held laser scanner was proposed to obtain a point cloud for a maize canopy and accurately reconstruct the 3D structure of maize plants. On this basis, the phenotypic traits of maize plants were calculated through the steps listed below.
First, the original 3D point clouds of maize plants from the trefoil stage to the jointing stage were obtained with a FastSCAN hand-held laser scanner. Then, to avoid the interference of the large amount of data redundancy generated by the 3D scanner and subsequent space occupation, and to further improve the speed of data processing, a simplification approach for the 3D point cloud with adaptive density based on the grid method was used to simplify the raw 3D data of the maize plants. On this basis, bilateral filtering was applied to progressively denoise the maize canopy to reduce the impact of noise on the accuracy of the 3D reconstruction of the maize canopy. In addition, phenotypic traits including plant height, stem diameter and canopy breadth were calculated on the basis of accurate reconstruction of the 3D structure of maize plants. In the last step, the validity of the algorithm proposed here was verified through linear regression analysis of the calculated and measured phenotypic traits. Figure 2 shows the flow chart for calculating the phenotypic traits of maize plants.

2.4. Pre-Processing of the Raw 3D Point Cloud

Due to the influence of the device itself and the external environment, the scanner may produce redundant data and noise in the process of obtaining the 3D point cloud of maize plants. Consequently, to obtain an accurate 3D structure of a plant, it is necessary to simplify the raw 3D point cloud and remove noise in advance, to improve the data processing speed and model accuracy. In this study, based on the characteristics of the point cloud data collected from the maize canopy, a simplification approach for 3D point clouds with adaptive density based on the grid method and a bilateral filtering algorithm for maize plants were proposed, which laid a foundation for accurately reconstructing the 3D structure of the maize canopy and calculating related phenotypic traits.

2.4.1. Simplification Algorithm for Raw Data

The simplification of a point cloud is indispensable in 3D point cloud preprocessing. Data redundancies are not conducive to subsequent accurate 3D reconstruction, making the calculation time-consuming and directly affecting the speed and accuracy of storage and data processing. In this regard, it is necessary to simplify the raw 3D point cloud data.
To preserve the feature information of the point cloud, an adaptive density reduction method for 3D point clouds based on the grid method was proposed to simplify the original point cloud of the maize plants. In the first step, after reading all points in the original point cloud model, a spatial data index was established for the 3D point cloud based on the 3D grid method. Thereafter, a cuboid bounding box was established with its three sides parallel to the three axes of the coordinate system. The grid was constructed with the coordinates of any point in the original 3D point cloud expressed as $p_i(x_i, y_i, z_i)$, and the side lengths were expressed as follows:
$X_{box} = X_{max} - X_{min}$, $Y_{box} = Y_{max} - Y_{min}$, $Z_{box} = Z_{max} - Z_{min}$
where $X_{box}$, $Y_{box}$ and $Z_{box}$ represent the maximum extents of the point cloud $p_i(x_i, y_i, z_i)$ along the x, y and z axes, respectively. The effect of point cloud simplification is affected by the size of the bounding box: the side length of the bounding box has to be increased if the raw point cloud data are to be simplified further, whereas the average density of the point cloud increases as the side length of the bounding box decreases. The side length of the grid is as follows:
$D = \gamma \sqrt[3]{X_{box} Y_{box} Z_{box} / N}$
where γ is the coefficient of proportionality and N is the number of original 3D point clouds. The adjustable side length of the grid is defined as follows:
$D' = \beta D$
where $\beta$ refers to the proportionality factor used to adjust the side length of the grid. By combining Formulas (2) and (3), the following formula can be obtained:
$D' = \beta \gamma \sqrt[3]{X_{box} Y_{box} Z_{box} / N}$
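The bounding-box and grid-size computation above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the function name and the default values of the coefficients γ and β are assumptions for demonstration only.

```python
import numpy as np

def grid_side_length(points, gamma=1.0, beta=1.0):
    """Adjustable grid side length D' = beta * gamma * cbrt(Xbox*Ybox*Zbox/N),
    following Formulas (1)-(4). `points` is an (N, 3) array of the raw cloud;
    `gamma` and `beta` are the proportionality coefficients (illustrative values).
    """
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    xbox, ybox, zbox = maxs - mins          # bounding-box extents along x, y, z
    n = len(points)
    return beta * gamma * (xbox * ybox * zbox / n) ** (1.0 / 3.0)
```

For the eight corners of a unit cube, for example, the extents are all 1 and N = 8, so the side length evaluates to the cube root of 1/8, i.e. 0.5.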
Analysis of the differences in the curvature and density of the 3D point clouds in different maize plant organs revealed that the density of the 3D point clouds of maize stems was relatively uniform, with a relatively small deviation in point cloud curvature. Compared with the stems, the 3D point cloud density of maize leaves was greater, which was associated with a relatively greater curvature deviation of the point clouds. Furthermore, the proportionality factor $\beta$ was utilized to adjust the side length of the grid, which directly affected the efficiency of the reduction. Hence, it was necessary to determine the curvature of the point cloud in space to select an appropriate proportionality factor $\beta$.
Covariance analysis is a common principal component analysis method for estimating normal vectors and curvature. The covariance matrix of the point set is as follows:
$C = [p_{i1} - \bar{p}, \ldots, p_{ik} - \bar{p}]^T \cdot [p_{i1} - \bar{p}, \ldots, p_{ik} - \bar{p}]$
where $\bar{p} = \frac{1}{k}\sum p_{ik}$ is the centroid of the neighborhood $N_k(p_i)$, with $p_{ik} \in N_k(p_i)$. The eigenvalues $\lambda_i$ ($i = 0, 1, 2$) of matrix $C$ are non-negative, and the corresponding three eigenvectors $v_i$ ($i = 0, 1, 2$) form an orthogonal basis. The plane $(x - \bar{p}) \cdot v_0 = 0$ that minimizes the distances from the points surrounding $p_i$ to the plane is the tangent plane; $v_0$ is the normal vector of the local surface at point $p_i$; and eigenvalue $\lambda_0$ measures the variation of the local surface along the normal vector. The calculation showed that $\sigma_k(p_i)$ was close to the curvature at $p_i$ and could therefore be used as the curvature value at this point.
$\sigma_{ki} = \sigma_k(p_i) = \lambda_0 / (\lambda_0 + \lambda_1 + \lambda_2)$
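The covariance-based curvature estimate of Formulas (5) and (6) can be sketched as follows. This is a sketch under the stated definitions, not the authors' code; the function name is illustrative.

```python
import numpy as np

def surface_variation(neighborhood):
    """Curvature proxy sigma = lambda0 / (lambda0 + lambda1 + lambda2)
    computed from the covariance matrix of a point's k-neighborhood
    (an (k, 3) array). Eigenvalues are sorted ascending so lambda0 is
    the variation along the estimated normal direction."""
    centered = neighborhood - neighborhood.mean(axis=0)
    cov = centered.T @ centered                 # 3x3 covariance (unnormalized)
    eigvals = np.sort(np.linalg.eigvalsh(cov))  # lambda0 <= lambda1 <= lambda2
    return eigvals[0] / eigvals.sum()
```

A perfectly planar neighborhood has no variation along its normal, so the estimate is 0; curved neighborhoods yield values up to 1/3.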
With the leaf point cloud set as $\Omega = \{P_i(x_i, y_i, z_i) \mid 1 \le i \le n\}$ and the sampling point set as $Q = \{q_i(x_i, y_i, z_i) \mid 1 \le i \le n\}$, for any point $q_i(x, y, z)$ in the set of sampling points, the curvature threshold can be expressed as follows:
$D = \frac{\sum_{i=1}^{N} \sigma_i}{N}$
With the number of point clouds in the bounding box set as $K$, and the point set as $W = \{P_i(x_i, y_i, z_i) \mid 1 \le i \le K\}$, for any point $p_i(x, y, z)$, if the curvature $\sigma_i$ of the point is calculated, then the mean curvature of the point cloud in the box is obtained as follows:
$D_i = \frac{\sum_{i=1}^{K} \sigma_i}{K}$
If $D_i \le D$, the grid is a non-detail bounding box with a relatively low curvature requirement and low density. By contrast, if $D_i > D$, then the grid is a detail bounding box with a high requirement of curvature and high density. Therefore, the proportionality factor is adjusted to screen the grid size, and the redundant 3D point cloud is simplified from the maize point cloud data.
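The detail/non-detail decision rule above can be sketched as a small helper. This is an illustrative sketch only; the grid-index representation and function name are assumptions, not the authors' implementation.

```python
import numpy as np

def classify_grids(grid_curvatures, threshold):
    """Label each bounding box as 'detail' (mean curvature D_i > threshold D)
    or 'non-detail' (D_i <= D), per Formulas (7)-(8). `grid_curvatures` maps
    a grid index to the curvature values sigma_i of the points it contains."""
    labels = {}
    for gid, sigmas in grid_curvatures.items():
        d_i = float(np.mean(sigmas))            # mean curvature of the box
        labels[gid] = "detail" if d_i > threshold else "non-detail"
    return labels
```

Non-detail boxes can then be downsampled aggressively while detail boxes keep more points, which is how the adaptive density reduction preserves leaf edges while thinning flat stem regions.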

2.4.2. Denoising Algorithm for Raw Data

For better removal of noise in the 3D reconstruction of the maize canopy, it was necessary to divide the feature region of the maize canopy. The average curvature of the 3D point cloud was used to divide the maize plant area, where the single area had less feature information, and the rich area possessed more feature information.
The average curvature of any point p i ( x , y , z ) in the 3D point cloud is as follows:
$\bar{D} = \frac{\sum_{i=1}^{N} \sigma_i}{N}$
To denoise different feature regions, the local eigenvalues at any point p i ( x , y , z ) in the 3D point cloud were compared with the threshold values. If the threshold was greater than the local eigenvalue, the point was marked as a single region (or conversely as a rich region). Additionally, different regions were divided for denoising, and the results of region division are shown in Figure 3. The maize plants were colorized with a gradient ramp of blue. The red parts of the plant represented the 3D cloud points with a single region, while the other parts, accounting for the majority of the maize plant, were used as rich regions, which had to be denoised further.
Subsequently, the 3D point clouds of maize with rich features were denoised with a bilateral filtering algorithm. In this study, the Jones bilateral filtering algorithm was applied to preserve the contour features of 3D point clouds in maize processing [41,42]. The algorithm is shown with Formula (10):
$p' = p + \alpha \cdot n$
where $p$ represents an initial data point; $n$ is the normal vector at $p$; $\alpha$ is the bilateral filter coefficient; and $p'$ refers to the point obtained after filtering. The calculation formula for $\alpha$ can be expressed as follows:
$\alpha = \frac{\sum_{i=1}^{k} w_1(\| p - p_i \|)\, w_2(\langle p - p_i, n \rangle)\, \langle p - p_i, n \rangle}{\sum_{i=1}^{k} w_1(\| p - p_i \|)\, w_2(\langle p - p_i, n \rangle)}$
where $w_1$ and $w_2$ represent the weights in the spatial and frequency domains of the bilateral filtering function, respectively, both of which control the smoothness and feature preservation of bilateral filtering. Additionally, $k$ is the number of points in the nearest neighborhood of the sampling point. The concrete forms of $w_1$ and $w_2$ are shown in Formulas (12) and (13):
$w_1(x) = e^{-x^2 / 2\sigma_1^2}$
$w_2(y) = e^{-y^2 / 2\sigma_2^2}$
In these formulas, parameter $\sigma_1$ is the influence factor of the distances from point $p$ to its neighboring points. The value of $\sigma_1$ is positively correlated with the number of neighborhood points; it is proportional to the smoothing effect but inversely proportional to the preservation of point cloud features. Parameter $\sigma_2$ is the influence factor of the projections of the distance vectors from point $p$ to its adjacent points onto the normal $n$ at $p$. Parameter $\sigma_2$ regulates the degree of feature preservation of the point cloud data in the filtering process and is proportional to the feature-preservation effect. Normally, $\sigma_1$ is set to the neighborhood radius of the point, and $\sigma_2$ to the standard deviation of the projections over the neighborhood points.
$\sigma_1 = \max \| p - p_i \|,\ i \in [1, k]$
$\sigma_2 = \sqrt{\frac{1}{k-1} \sum_{i=1}^{k} (\xi_i - \bar{\xi})^2},\ \xi_i = \langle p - p_i, n \rangle$
The specific steps are as follows:
(1) Calculate the $k$ adjacent points of each data point $p_i$ in the feature-rich region of the 3D model;
(2) For each of the $k$ adjacent points, calculate the value $x = \| p - p_i \|$ for parameter $w_1(x)$ and the value $y = \langle p - p_i, n \rangle$ for parameter $w_2(y)$;
(3) On the basis of $\sigma_1$ and $\sigma_2$, calculate the values of $w_1(x)$ and $w_2(y)$ according to Formulas (12) and (13);
(4) Calculate the bilateral filtering factor $\alpha$, and then obtain the new point cloud data after filtering by moving the points in the feature-rich region along the normal direction using Formula (10);
(5) Traverse all the original point clouds to obtain the filtered new point cloud data.
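The per-point update in the steps above can be sketched as follows. This is a minimal sketch of the bilateral formulas as written; the function name is illustrative, normals are assumed given (in practice they come from the PCA step), and σ₁, σ₂ are passed in directly rather than derived from the neighborhood.

```python
import numpy as np

def bilateral_filter_point(p, n, neighbors, sigma1, sigma2):
    """One bilateral filtering step, p' = p + alpha * n (Formulas (10)-(13)).
    `p` is the point, `n` its unit normal, `neighbors` its k nearest points."""
    num = den = 0.0
    for pi in neighbors:
        x = np.linalg.norm(p - pi)           # spatial distance ||p - p_i||
        y = float(np.dot(p - pi, n))         # projection <p - p_i, n>
        w1 = np.exp(-x**2 / (2.0 * sigma1**2))
        w2 = np.exp(-y**2 / (2.0 * sigma2**2))
        num += w1 * w2 * y
        den += w1 * w2
    alpha = num / den if den > 0.0 else 0.0  # bilateral filter coefficient
    return p + alpha * n
```

When a point already lies in the plane of its neighbors, every projection $y$ is zero, so $\alpha = 0$ and the point is left unmoved; only points that deviate along the normal are pulled back.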

2.5. Calculation Method for Phenotypic Traits

In view of current methods for 3D reconstruction of maize plants, the mathematical method was included in this study to describe plant organs, with the aim of reducing data processing and providing objective descriptions of the repeatability and parameterization of growth processes. In this study, the phenotypic traits of plant height, stem diameter and canopy breadth were calculated, which play an important role in evaluating the growth of maize.

2.5.1. Calculation Method for Plant Height

Plant height refers to the height difference from the ground to the highest natural extension of leaves. Plant height is an important trait in maize variety cultivation that directly affects the lodging resistance and harvest potential of maize varieties. Therefore, it is of great significance for maize breeding to measure plant height rapidly and accurately. In this study, the y coordinate was taken as the axis, the highest point of the maize canopy in different periods as the vertex, the vertex as the center of the circle, and the lowest point in the vertical direction of the vertex as the base point to form the fitting sphere [43]. The radius of the fitting sphere was the plant height of maize (Figure 4).
To calculate the radius of the fitting sphere and obtain the plant height, the least squares method was used to calculate the plant height of maize. The distance function from any point P i of the fitting sphere to the center of the sphere is as follows:
$d(s, P_i) = \| c - P_i \|$
where $c = (S_1, S_2, S_3)^T$ represents the coordinates of the sphere center.
According to the above calculation method for plant height, the distance from any point on the sphere surface to the spherical center was calculated (i.e., the radius of the fitting sphere) as the plant height of maize from the trefoil stage to the jointing stage.
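A standard algebraic least-squares sphere fit illustrates how the radius (and hence the plant height) can be recovered from canopy points. This is a sketch of one common solver, not necessarily the authors' exact method: expanding $\|P_i - c\|^2 = r^2$ gives the linear system $2 P_i \cdot c + (r^2 - \|c\|^2) = \|P_i\|^2$.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit. `points` is an (N, 3) array on or near the
    sphere surface. Solves the linearized system
        2*P_i . c + (r^2 - ||c||^2) = ||P_i||^2
    and returns (center, radius); the radius serves as the plant height."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)   # recover r from r^2 - ||c||^2
    return center, radius
```

Six axis-aligned points on a sphere of radius 5 centered at (1, 2, 3), for instance, recover that center and radius exactly.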

2.5.2. Calculation Method for Stem Diameter

Stem growth can indirectly reflect crop growth, vigor, lodging resistance, and so on. Domestic and foreign research has also focused on the measurement of crop stems. For example, the research team at Osnabrück University in Germany designed the high-throughput phenotypic measurement robots Breed Vision [44] and BoniRob [45,46] to measure crop stems and other phenotypic traits by using various optical sensors, such as light curtain imaging, 3D time-of-flight cameras and laser sensors, not only for individual plant phenotyping but also for non-destructive field-based phenotyping in plant breeding. Additionally, Paulus et al. [47,48,49] scanned cereal plants with a hand-held laser scanner and reconstructed 3D models to obtain the stem parameters of crops. Although the above study demonstrated the feasibility of the calculation method for stem parameters involving a hand-held laser scanner, this result was not for stem diameter but for stem height in barley. Stem diameter is a key trait in maize breeding. Thus, to calculate stem diameter accurately, the fitting cylinder was constructed based on a method of coordinate transformation [50]. The diameter of the cylinder was the stem diameter of the maize stem, which consists of nodes and internodes. The diameter of the third internode counted from the bottom to the top was used to indicate the growth status of stems (Figure 5).
The distance equation from any point P i on the surface of the cylinder to the center of the circle is as follows:
$d(s, P_i) = \sqrt{\| q_0 - p_i \|^2 - (a_0 \cdot (q_0 - p_i))^2}$
where $q_0 = (S_1, S_2, S_3)^T$ refers to the center coordinate and $a_0$ represents the unit direction vector of the cylindrical axis.
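The radial-distance formula above evaluates directly: the squared distance to an axis point minus the squared projection onto the axis leaves the squared perpendicular distance. A minimal sketch (function name illustrative):

```python
import numpy as np

def cylinder_radial_distance(p, q0, a0):
    """Distance from point p to a cylinder axis through q0 with unit
    direction a0 (Formula (15)): sqrt(||q0 - p||^2 - (a0 . (q0 - p))^2).
    The fitted radius, doubled, gives the stem diameter."""
    d = q0 - p
    return np.sqrt(d @ d - (a0 @ d) ** 2)
```

For the z-axis (q0 at the origin, a0 = (0, 0, 1)) and the point (3, 4, 7), the axial component 7 is removed and the distance is the planar radius 5.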

2.5.3. Calculation Method for Canopy Breadth

Canopy breadth is one of the important reference standards for measuring plant growth. In this study, based on the x-coordinate axis, the left-most point of the maize canopy in different periods was selected as the center of the circle [43]. At the same height as the center, the fitting sphere was generated with its radius equal to the distance from the center to the right-most point, and this radius was the canopy breadth of maize (Figure 6). Then, the canopy breadth was obtained with Formula (16).

3. Results

3.1. Acquisition of Raw Data

When using the FastSCAN 3D scanner to reconstruct the 3D model of the whole maize plant, it was necessary to ensure that the electromagnetic reference body (consisting of a transmitter and a receiver) and the maize plant were in a relatively static state and that neither side could move during the scanning process. To improve scanning quality, the hand-held device (Wand) and the part of maize plant to be scanned were kept perpendicular during the scanning process.
Representative maize plants are shown in Figure 7. The raw 3D point clouds of maize plants from the trefoil stage to the jointing stage were obtained with the FastSCAN 3D scanner, which are shown in Figure 8.

3.2. Simplification Effect of Raw Data

To verify the effectiveness of the simplification method, the raw 3D point cloud data of maize plants from the trefoil stage to the jointing stage were simplified. Point cloud simplification was realized on the premise of retaining the details of the 3D model of the maize canopy, and the simplification rate was regarded as the evaluation index, defined as follows:
rate = \frac{P - P_i}{P} \times 100\%
where P represents the total number of points in the raw 3D point cloud; P_i indicates the total number of points after reduction; and rate represents the percentage of the raw total removed by simplification.
Furthermore, the proposed adaptive curvature method was applied to reduce the number of points in flat areas while retaining the feature details of the point clouds in high-curvature areas. According to height, the raw point clouds were progressively reduced from top to bottom. Taking randomly selected maize plants as an example, the details of the simplification effect on leaves and stems are illustrated in Figure 9.
The maize canopy, especially the maize leaves and stems, was gradually simplified from the trefoil stage to the jointing stage. The number of point clouds and the reduction rate after the reduction are presented in Table 1.
The simplification rate of the raw point cloud in the maize canopy was approximately 25%, which ensured the validity of the simplified point cloud and reduced the processing time of the raw point cloud data. Figure 10 presents the overall simplification effect of the 3D point cloud of maize from the trefoil stage to the jointing stage. Maize plants were simplified via adaptive curvature reduction, and the point cloud model could still retain the morphological characteristics of maize plants. Point cloud simplification was achieved here based on the curvature of the point cloud. The curvature value was positively correlated with the density of the point cloud. Additionally, the degree of point cloud retention was higher in dense areas than in sparse areas.
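A minimal sketch of the underlying grid reduction idea, assuming a uniform cell size (the adaptive-curvature variant would shrink the cell in high-curvature areas to retain leaf-edge detail). The random cloud and cell size are placeholders, not the data of this study:

```python
import numpy as np

def grid_simplify(points, cell=0.01):
    """Uniform grid reduction: quantize points to cells of side `cell` and
    keep the first point encountered in each occupied cell."""
    keys = np.floor(points / cell).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

rng = np.random.default_rng(0)
cloud = rng.random((5000, 3))                 # synthetic 1 m^3 canopy cloud
reduced = grid_simplify(cloud, cell=0.05)
# Simplification rate: percentage of raw points removed
rate = (len(cloud) - len(reduced)) / len(cloud) * 100.0
```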

3.3. Denoising Effect of Raw Data

Aiming at an objective evaluation of the denoising algorithm proposed herein, the reduced point cloud data of maize plants from the trefoil stage to the jointing stage were denoised, and the denoising effect was compared with that of the classical Laplace denoising algorithm. Two objective evaluation indexes, the maximum error and the average error, were selected for quantitative analysis of the denoising effect. The comparison results are shown in Table 2. The maximum error measures the maximum distance of point cloud data movement, which was negatively correlated with the quality of the point cloud data. Taking randomly selected maize plants as an example, the denoising effects on some leaves were compared, as shown in Figure 11. The leaf surfaces are much smoother after the bilateral filtering algorithm than after the Laplace algorithm, and the edge details of the leaves are also well preserved.
The maximum error of the proposed algorithm was 21.1% lower than that of the traditional Laplace filtering algorithm. Additionally, the average error measured the average distance of point cloud data movement, which was positively correlated with the denoising effect of the point cloud. The average error of this algorithm was 38.2% lower than that of the Laplace filtering algorithm. Figure 12 shows the overall denoising effect of the 3D point cloud of maize from the trefoil stage to the jointing stage. The noise in the raw 3D point cloud is labeled in red.
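A simplified sketch of bilateral filtering on a point cloud, assuming per-point normals are available: each point is displaced along its normal by a neighborhood average weighted by a spatial Gaussian and a feature (normal-deviation) Gaussian, which smooths flat areas while preserving edges. The parameter values and the noisy test patch are illustrative assumptions, not the exact algorithm parameters of this study:

```python
import numpy as np
from scipy.spatial import cKDTree

def bilateral_filter(points, normals, radius=0.01, sigma_c=0.005, sigma_s=0.002):
    """Move each point along its normal by a weighted average of neighbor
    deviations: sigma_c controls the spatial weight, sigma_s the feature
    weight that preserves sharp edges."""
    tree = cKDTree(points)
    out = points.copy()
    for i, (p, n) in enumerate(zip(points, normals)):
        idx = tree.query_ball_point(p, radius)
        q = points[idx]
        d = np.linalg.norm(q - p, axis=1)    # spatial distance to neighbors
        h = (q - p) @ n                      # neighbor deviation along the normal
        w = np.exp(-d**2 / (2 * sigma_c**2)) * np.exp(-h**2 / (2 * sigma_s**2))
        if w.sum() > 0:
            out[i] = p + n * (w @ h) / w.sum()
    return out

# Noisy flat patch: z-jitter should shrink after filtering
rng = np.random.default_rng(1)
xy = rng.random((500, 2)) * 0.1
pts = np.column_stack([xy, rng.normal(0.0, 0.001, 500)])
nrm = np.tile([0.0, 0.0, 1.0], (500, 1))
smoothed = bilateral_filter(pts, nrm)
```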

3.4. Effectiveness of the Calculation Method for Phenotypic Traits

Ten representative potted maize plants were selected, and phenotypic traits including plant height, stem diameter and canopy breadth were calculated from the trefoil stage to the jointing stage using the calculation methods described in Section 2.5. This interval covered 6 periods (the 3-leaf, 4-leaf, 5-leaf, 6-leaf, 7-leaf and 8-leaf stages). We calculated and measured the phenotypic traits of the 10 potted plants in each period; thus, there were 6 periods with 10 samples per period. The calculated values were compared with the actual measured values.
As indicated in Figure 13a, the calculated value was highly correlated with the measured value based on the calculation method for maize plant height proposed in this study (R2 = 0.9807). According to Figure 13b, the stem diameter measurement method for maize had a good processing effect and could measure the stem diameter accurately; the determination coefficient, R2, of the calculated and measured values was 0.8907. In Figure 13c, the calculated and measured values of canopy breadth are also highly correlated, and the determination coefficient, R2, reached 0.9562, which corresponds to the actual phenotypic traits and proves the validity of the proposed method.
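This kind of validation statistic can be reproduced with a simple linear regression; the paired values below are hypothetical placeholders, not the measurements reported in Figure 13:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired samples: manually measured vs. calculated plant height (cm)
measured = np.array([30.5, 42.1, 55.0, 63.2, 78.4, 91.0])
calculated = np.array([31.0, 41.5, 54.2, 64.0, 77.8, 92.1])

fit = linregress(measured, calculated)
r_squared = fit.rvalue ** 2   # determination coefficient R^2
```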

4. Discussion

4.1. Analysis and Comparison of the Experimental Results

Manual measurement of the actual phenotypic traits was affected not only by subjective factors related to the surveyor but also by the external environment (e.g., the motion of maize leaves with the wind). Thus, to acquire traits more accurately, actual measurement and data acquisition should be carried out indoors or in a windless environment. In addition, when calculating plant height, stem diameter and canopy width via the method presented in this study, the selection of boundary points [50] in the 3D model of the maize canopy, such as the highest point, lowest point, leftmost point and rightmost point, is also an important factor affecting the calculation accuracy for phenotypic traits.
The calculation accuracy for stem diameter was lower than its counterparts for other traits. In addition to the impact of the external environment and the selection of boundary points, some measurement errors could result from the great interference of the point cloud data at the edge of the maize stem with cylinder fitting. Accordingly, the interference of individual points with cylindrical fitting could be reduced and the accuracy and stability of measurement could be promoted by improving the cylinder fitting algorithm [51,52].
Besides the hand-held laser scanner used here, other laser scanning devices, such as terrestrial laser scanners (Trimble TX8, Danderyd, Sweden) [53], and range cameras, such as the Kinect v2.0 sensor (Kinect-v2, Microsoft, Redmond, WA, USA) [54], are good choices for the phenotyping analysis of plants due to their high resolutions; they have been applied to fruit trees and soybean plants, respectively, in our previous research.
Although the Trimble TX8 3D laser scanner can acquire data within a range of 120 m, the pre-processing of huge datasets containing much redundant data and the registration of point clouds restrict the speed of 3D reconstruction for plants. Moreover, it takes more time than the FastSCAN when acquiring 3D point clouds. For the Trimble TX8, data from at least three stations are needed to reconstruct a maize plant, and it takes at least 5 min to acquire the data of each station. Although Trimble RealWorks software (version 11.1) [55] is a good tool for reconstructing the 3D model of a maize plant from the three stations, registration takes at least another 10 min due to the huge amount of 3D data involved. In contrast, it takes at most 15 min to acquire a 3D model of a maize plant using the FastSCAN hand-held laser scanner. Thus, the Trimble TX8 laser scanner was not adopted to acquire data of maize plants for comparison with the scanner used in this study.
As a new kind of range camera, the red-green-blue-depth (RGB-D) camera, consisting of an RGB camera, a depth sensor and infrared emitters, has been used extensively in numerous applications [56,57] due to its low cost and fast speed. The Kinect v2.0 sensor is a representative of such cameras and is used extensively in the phenotyping analysis of plants [58]. The Kinect v2.0 sensor, originally designed for natural interaction in computer gaming environments, can simultaneously acquire RGB images (1920 × 1080 pixels), depth images (512 × 424 pixels) and infrared images (512 × 424 pixels) of maize canopies with a field of view (FOV) of 70 degrees (H) by 60 degrees (V) at 30 frames per second [59]. Due to its high speed, the Kinect v2.0 sensor was used to acquire 3D point clouds of maize plants and calculate the same phenotypic traits, so that its accuracy could be compared with that of the approach presented here.
Two shooting angles were used to acquire 3D point clouds of maize plants (Figure 14). The 3D point clouds captured from the top view were used to calculate plant height and canopy breadth, while those from the side view were used to calculate stem diameter.
The determination coefficients (R2) were found to be 0.9858, 0.6973 and 0.8854 for plant height, stem diameter and canopy breadth, respectively, based on the 3D data of the Kinect v2.0 sensor (Table 3). Compared with the values of phenotypic traits calculated in this study, higher accuracy for plant height was achieved using the Kinect v2.0 sensor, while the R2 values for stem diameter and canopy breadth were lower than those of the FastSCAN. Resolution was the first factor affecting accuracy. With increasing distance, the accuracy of the 3D point cloud decreases from a standard deviation (SD) of a few millimeters to approximately 40 mm, and the point-to-point distance increases from 2 mm to 4 mm under a FOV of 70 degrees (H) by 60 degrees (V) for the Kinect v2.0 sensor [59]. In addition, the loss of pixels at the edges of leaves and stems, which was worse than with the FastSCAN, was another important factor affecting accuracy. Thus, the FastSCAN hand-held laser scanner was a better device than the Kinect v2.0 sensor for acquiring the phenotypic traits of maize plants in this study.

4.2. Advantages and Limitations of the Acquisition System

The present study verified the ability of a FastSCAN Cobra™ hand-held 3D laser scanner and the proposed algorithms to calculate the plant height, stem diameter and canopy breadth of maize plants accurately. However, three factors restricted the performance of the proposed approach. First, the FastSCAN Cobra™ hand-held 3D laser scanner is sensitive to strong light, which limits the operating environment of the device. Thus, using the scanner indoors, or outdoors under shade and without wind, will ensure its effective application. Second, the acquired raw 3D data lack true color information, which is not conducive to research on the color characteristics of a plant canopy. This limitation can be addressed using multi-sensor fusion methods [60,61], such as the combination of 3D point clouds with color information through the registration of a coordinate system between depth sensors and a visible light imaging sensor [62]. Third, the quality of the point cloud and the 3D reconstruction effect are affected by the manual scanning operation; redundant points and noise points increase when the same area is scanned repeatedly, which occurs when the scanning effect in a certain area of a maize plant is not ideal. Although repeated scanning is unavoidable, improved simplification and denoising methods [63] can be developed to remove the noise from 3D point clouds and improve 3D reconstruction accuracy.
It is important to choose a suitable device for a specific phenotyping analysis. Thus, researchers must have in-depth knowledge of the advantages and limitations of each type of imaging system to calculate phenotypic traits with good accuracy and efficiency at an affordable cost. The advantages and limitations of common sensors for acquiring geometric traits are shown in Table 4. We finally selected the FastSCAN hand-held laser scanner for calculating plant height, stem diameter and canopy breadth according to its accuracy and cost performance after comparing the strengths and weaknesses of these sensors. Although its short measurement range and hand-held operation remain challenges for high-throughput phenotyping applications using UAVs and other mobile platforms, such as robotic arms or vehicle platforms, its high resolution and nearly perfect 3D modeling effect give it an outstanding advantage for calculating fine phenotypic traits, such as stem diameter and leaf inclination angle (LIA) [64].

4.3. Future Work

The plant height, stem diameter and canopy breadth of two varieties of maize plants were calculated in this study. From a breeding perspective, a promising direction for future work is the acquisition of as many other phenotypic traits as possible, such as the leaf inclination angle (LIA) [64], leaf area index (LAI) [65], leaf area (LA) and color indices (CI) [66]. LIA indicates the water stress of plants and affects the measurement of LAI [67]. LA refers to the area of an individual leaf, and its estimation is important for the biometrical observation of a single plant [68]. LAI is not only the ratio of the total leaf area of a plant to the land area but also a comprehensive index of the utilization of light energy within the canopy structure [69]. CI reflects the nutritional status of crops [70]. Additionally, parameters related to photosynthetic capacity and nutritional status, such as the chlorophyll content and nitrogen content, will be measured during the entire growth stage of maize plants, and the relationships between LIA and LAI [71] and between CI and nutrient contents will be studied [70].

5. Conclusions

In this study, we proposed a method for calculating phenotypic traits based on the 3D reconstruction of maize canopies. The major conclusions are as follows:
(1) An adaptive curvature simplification method of 3D point clouds based on the grid method was proposed. First, the curvature of the marked point cloud was calculated. Second, the size of the outer bounding box was controlled to reduce the 3D point cloud data of maize leaves. The experimental results showed that the whole point cloud was reduced by approximately 25% on the premise of guaranteeing the morphological characteristics of the maize canopy.
(2) Bilateral filtering was used to denoise the feature-rich regions of maize. The maximum error and average error of the proposed algorithm were 21.1% and 38.2% lower, respectively, than those of the traditional Laplace filtering algorithm, providing morphological information for the 3D modeling of maize in a more nuanced manner.
(3) Fitting spheres and cylinders were used to obtain plant height, stem diameter and canopy breadth. The determination coefficients R2 were found to be 0.9807, 0.8907 and 0.9562, respectively. The above experimental results suggested that the proposed method for 3D reconstruction of the maize canopy and the approach for calculating phenotypic traits exhibited high accuracy, providing technical support for further study of the phenotypic traits and breeding of other crops.

Author Contributions

H.G., X.M. and G.L. conceived and designed the experiments; J.F. and K.Z. performed the experiments and acquired the 3D data of maize canopy; X.M., J.F. and K.Z. calculated plant parameters; S.Y. directed the maize planting; X.M. wrote the paper.

Funding

This study was funded jointly by National Natural Science Foundation of China (31601220), Natural Science Foundation of Heilongjiang Province (QC2016031), China Postdoctoral Science Foundation (2016M601464), Support Program for Natural Science Talent of Heilongjiang Bayi Agricultural University (ZRCQC201806), Heilongjiang Bayi Agricultural University Innovative Research Team Foundation (TDJH201807).

Acknowledgments

The authors would like to thank the two anonymous reviewers, academic editors for their precious suggestions that significantly improved the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hölker, A.C.; Schipprack, W.; Utz, H.F.; Molenaar, W.S.; Melchinger, A.E. Progress for testcross performance within the flint heterotic pool of a public maize breeding program since the onset of hybrid breeding. Euphytica 2019, 215, 50. [Google Scholar] [CrossRef]
  2. Sserumaga, J.P.; Makumbi, D.; Warburton, M.L.; Opiyo, S.O.; Asea, G.; Muwonge, A.; Kasozi, C. Genetic diversity among tropical provitamin a maize inbred lines and implications for a biofortification program. Cereal Res. Commun. 2019, 47, 134–144. [Google Scholar] [CrossRef]
  3. Sarker, A.; Muehlbauer, F.J. Improving cultivation of lentil International Center for Agricultural Research in the Dry Areas (ICARDA), India. In Achieving Sustainable Cultivation of Grain Legumes; Burleigh Dodds Science Publishing: London, UK, 2018; Volume 2, pp. 93–104. [Google Scholar]
  4. Su, Y.; Wu, F.; Ao, Z.; Jin, S.; Qin, F.; Liu, B.; Guo, Q. Evaluating maize phenotype dynamics under drought stress using terrestrial lidar. Plant Methods 2019, 15, 11. [Google Scholar] [CrossRef] [PubMed]
  5. Xin, Y.; Tao, F. Optimizing genotype-environment-management interactions to enhance productivity and eco-efficiency for wheat-maize rotation in the North China Plain. Sci. Total Environ. 2019, 654, 480–492. [Google Scholar] [CrossRef] [PubMed]
  6. Biddick, M.; Burns, K.C. Phenotypic trait matching predicts the topology of an insular plant–bird pollination network. Integr. Zool. 2018, 13, 339–347. [Google Scholar] [CrossRef] [PubMed]
  7. Santangelo, J.S.; Thompson, K.A.; Johnson, M.T. Herbivores and plant defences affect selection on plant reproductive traits more strongly than pollinators. J. Evol. Biol. 2019, 32, 4–18. [Google Scholar] [CrossRef] [PubMed]
  8. Bolger, A.M.; Poorter, H.; Dumschott, K.; Bolger, M.E.; Arend, D.; Osorio, S.; Usadel, B. Computational aspects underlying genome to phenome analysis in plants. Plant J. 2019, 97, 182–198. [Google Scholar] [CrossRef]
  9. Guan, H.; Liu, M.; Ma, X.; Yu, S. Three-dimensional reconstruction of soybean canopies using multisource imaging for phenotyping analysis. Remote Sens. 2018, 10, 1206. [Google Scholar] [CrossRef]
  10. Zhang, Y.; Liu, P.; Zhang, X.; Zheng, Q.; Chen, M.; Ge, F.; Zheng, Y. Multi-locus genome-wide association study reveals the genetic architecture of salk lodging resistance-related traits in maize. Front. Plant Sci. 2018, 9, 611. [Google Scholar] [CrossRef]
  11. Bao, Y.; Tang, L.; Srinivasan, S.; Schnable, P.S. Field-based architectural traits characterisation of maize plant using time-of-flight 3D imaging. Biosyst. Eng. 2019, 178, 86–101. [Google Scholar] [CrossRef]
  12. Liu, W.G.; Hussain, S.; Liu, T.; Zou, J.L.; Ren, M.L.; Zhou, T.; Yang, F.; Yang, W.Y. Shade stress decreases stem strength of soybean through restraining lignin biosynthesis. J. Integr. Agric. 2019, 18, 43–53. [Google Scholar] [CrossRef]
  13. Wang, J.; Kang, S.; Du, T.; Tong, L.; Ding, R.; Li, S. Estimating the upper and lower limits of kernel weight under different water regimes in hybrid maize seed production. Agric. Water Manag. 2019, 213, 128–134. [Google Scholar] [CrossRef]
  14. Zhu, B.; Liu, F.; Zhu, J.; Guo, Y.; Ma, Y. Three-dimensional quantifications of plant growth dynamics in field-grown plants based on machine vision method. Trans. Chin. Soc. Agric. Mach. 2018, 49, 256–262. [Google Scholar]
  15. Sun, S.; Li, C.; Paterson, A.H. In-field high-throughput phenotyping of cotton plant height using LiDAR. Remote Sens. 2017, 9, 377. [Google Scholar] [CrossRef]
  16. Sheng, Y.; Song, L. Agricultural production and food consumption in China: A long-term projection. China Econ. Rev. 2019, 53, 15–29. [Google Scholar] [CrossRef]
  17. Han, S.; Miedaner, T.; Utz, H.F.; Schipprack, W.; Schrag, T.A.; Melchinger, A.E. Genomic prediction and gwas of gibberella ear rot resistance traits in dent and flint lines of a public maize breeding program. Euphytica 2018, 214, 6. [Google Scholar] [CrossRef]
  18. Pajares, G. Overview and current status of remote sensing applications based on Unmanned aerial vehicles (UAVs) photogramm. Eng. Remote Sens. 2015, 81, 281–330. [Google Scholar]
  19. Grosskinsky, D.K.; Svensgaard, J.; Christensen, S.; Roitsch, T. Plant phenomics and the need for physiological phenotyping across scales to narrow the genotype-to-phenotype knowledge gap. J. Exp. Bot. 2015, 66, 5429–5440. [Google Scholar] [CrossRef] [Green Version]
  20. Zhang, Y.; Zhang, N. Imaging technologies for plant high-throughput phenotyping: A review. Front. Agric. Sci. Eng. 2018, 5, 406–419. [Google Scholar] [CrossRef]
  21. Dreccer, M.F.; Molero, G.; Rivera-Amado, C.; John-Bejai, C.; Wilson, Z. Yielding to the image: How phenotyping reproductive growth can assist crop improvement and production. Plant Sci. 2018. [Google Scholar] [CrossRef]
  22. Qiu, R.; Wei, S.; Zhang, M.; Sun, H.; Li, H.; Liu, G. Sensors for measuring plant phenotyping: A review. Int. J. Agric. Biol. Eng. 2018, 11, 1–17. [Google Scholar] [CrossRef]
  23. Frasson, R.P.D.M.; Krajewski, W.F. Three-dimensional digital model of a maize plant. Agric. For. Meteorol. 2010, 150, 488. [Google Scholar] [CrossRef]
  24. Liu, J.; Zhao, C.; Yang, G.; Yu, H.; Zhao, X.; Xu, B.; Niu, Q. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform. Trans. Chin. Soc. Agric. Eng. (Trans. Csae) 2016, 32, 98–106. [Google Scholar]
  25. Schmidt, D.; Kahlen, K. Towards More Realistic Leaf Shapes in functional-structural plant models. Symmetry 2018, 10, 278. [Google Scholar] [CrossRef]
  26. Li, H. 3D Reconstruction of maize leaves based on virtual visual technology. Bull. Sci. Technol. 2016, 32, 96–101. [Google Scholar]
  27. Guo, W.; Fukatsu, T.; Ninomiya, S. Automated characterization of flowering dynamics in rice using field-acquired time-series RGB images. Plant Methods 2015, 11, 1–15. [Google Scholar] [CrossRef] [PubMed]
  28. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  29. Ma, X.; Feng, J.; Guan, H.; Liu, G. Prediction of Chlorophyll Content in Different Light Areas of Apple Tree Canopies based on the Color Characteristics of 3D Reconstruction. Remote Sens. 2018, 10, 429. [Google Scholar] [CrossRef]
  30. Vicari, M.B.; Pisek, J.; Disney, M. New estimates of leaf angle distribution from terrestrial LiDAR: Comparison with measured and modelled estimates from nine broadleaf tree species. Agric. For. Meteorol. 2019, 264, 322–333. [Google Scholar] [CrossRef]
  31. Wu, Z.; Ni, M.; Hu, Z.; Wang, J.; Li, Q.; Wu, G. Mapping invasive plant with UAV-derived 3D mesh model in mountain area-A case study in Shenzhen Coast, China. Int. J. Appl. Earth Obs. Geoinf. 2019, 77, 129–139. [Google Scholar] [CrossRef]
  32. Sankaran, S.; Khot, L.R.; Carter, A.H. Field-based crop phenotyping: Multispectral aerial imaging for evaluation of winter wheat emergence and spring stand. Comput. Electron. Agric. 2015, 118, 372–379. [Google Scholar] [CrossRef]
  33. Campbell, Z.C.; Acosta-Gamboa, L.M.; Nepal, N.; Lorence, A. Engineering plants for tomorrow: How high-throughput phenotyping is contributing to the development of better crops. Phytochem. Rev. 2018, 17, 1329–1343. [Google Scholar] [CrossRef]
  34. Zhou, J.; Fu, X.; Schumacher, L.; Zhou, J. Evaluating Geometric Measurement Accuracy Based on 3D Reconstruction of Automated Imagery in a Greenhouse. Sensors 2018, 18, 2270. [Google Scholar] [CrossRef] [PubMed]
  35. Thomas, S.; Behmann, J.; Steier, A.; Kraska, T.; Muller, O.; Rascher, U.; Mahlein, A.K. Quantitative assessment of disease severity and rating of barley cultivars based on hyperspectral imaging in a non-invasive, automated phenotyping platform. Plant Methods 2018, 14, 45. [Google Scholar] [CrossRef] [PubMed]
  36. Zhou, J.; Chen, H.; Zhou, J.; Fu, X.; Ye, H.; Nguyen, H.T. Development of an automated phenotyping platform for quantifying soybean dynamic responses to salinity stress in greenhouse environment. Comput. Electron. Agric. 2018, 151, 319–330. [Google Scholar] [CrossRef]
  37. Jay, S.; Rabatel, G.; Hadoux, X.; Moura, D.; Gorretta, N. In-field crop row phenotyping from 3D modeling performed using Structure from Motion. Comput. Electron. Agric. 2015, 110, 70–77. [Google Scholar] [CrossRef]
  38. Mir, R.R.; Reynolds, M.; Pinto, F.; Khan, M.A.; Bhat, M.A. High-throughput Phenotyping for Crop Improvement in The Genomics Era. Plant Sci. 2019. [Google Scholar] [CrossRef]
  39. Khanna, R.; Schmid, L.; Walter, A.; Nieto, J.; Siegwart, R.; Liebisch, F. A spatio temporal spectral framework for plant stress phenotyping. Plant Methods 2019, 15, 13. [Google Scholar] [CrossRef]
  40. Hosoi, F.; Kenji, O. Estimating vertical plant area density profile and growth parameters of a wheat canopy at different growth stages using three-dimensional portable lidar imaging. Isprs J. Photogramm. Remote Sens. 2009, 64, 151–158. [Google Scholar] [CrossRef]
  41. Xia, C.; Shi, Y.; Yin, W. Obtaining and denoising method of three-dimensional point cloud data of plants based on TOF depth sensor. Trans. Chin. Soc. Agric. Eng. (Trans. Csae) 2018, 34, 168–174. [Google Scholar]
  42. Xue, A.; Ju, S.; He, W.; Chen, W. Study on algorithms for local outlier detection. Chin. J. Comput. 2007, 30, 1455–1463. [Google Scholar]
  43. Zhang, L.; Cheng, X.; Tan, K. Automatically sphere target extracting and parameter fitting based on intensity image. Geotech. Investig. Surv. 2014, 12, 65–69. [Google Scholar]
  44. Busemeyer, L.; Mentrup, D.; Möller, K.; Wunder, E.; Alheit, K.; Hahn, V.; Rahe, F. BreedVision-A multi-sensor platform for non-destructive field-based phenotyping in plant breeding. Sensors 2013, 13, 2830–2847. [Google Scholar] [CrossRef] [PubMed]
  45. Ruckelshausen, A.; Biber, P.; Dorna, M.; Gremmes, H.; Klose, R.; Linz, A.; Weiss, U. BoniRob: An autonomous field robot platform for individual plant phenotyping. Precis. Agric. 2009, 9, 841–847. [Google Scholar]
  46. Ralph, K.; Jaime, P.; Arno, R. Usability study of 3D time-of flight cameras for automatic plant phenotyping. Bornimer Agrartech. Ber. 2009, 69, 93–105. [Google Scholar]
  47. Paulus, S.; Dupuis, J.; Mahlein, A.K.; Kuhlmann, H. Surface feature-based classification of plant organs from 3D laser scanned point clouds for plant phenotyping. Bmc Bioinform. 2013, 14, 1–12. [Google Scholar] [CrossRef] [PubMed]
  48. Paulus, S.; Schumann, H.; Kuhlmann, H.; Léon, J. High-precision laser scanning system for capturing 3D plant architecture and analyzing growth of cereal plants. Biosyst. Eng. 2014, 121, 1–11. [Google Scholar] [CrossRef]
  49. Paulus, S.; Dupuis, J.; Riedel, S.; Kuhlmann, H. Automated analysis of barley organs using 3D laser scanning: An approach for high throughput phenotyping. Sensors 2014, 14, 12670–12686. [Google Scholar] [CrossRef]
  50. Yan, Y.; Wang, J. Cylindrical fitting method of laser scanner point cloud data. Sci. Surv. Mapp. 2018, 43, 83–87. [Google Scholar]
  51. Jackson, T.; Shenkin, A.; Wellpott, A.; Calders, K.; Origo, N.; Disney, M.; Fourcaud, T. Finite element analysis of trees in the wind based on terrestrial laser scanning data. Agric. For. Meteorol. 2019, 265, 137–144. [Google Scholar] [CrossRef]
  52. Moritani, R.; Kanai, S.; Date, H.; Watanabe, M.; Nakano, T.; Yamauchi, Y. Cylinder-based Efficient and Robust Registration and Model Fitting of Laser-scanned Point Clouds for As-built Modeling of Piping Systems. Proc. Cad 2019, 16, 396–412. [Google Scholar] [CrossRef]
  53. Guo, C.; Zong, Z.; Zhang, X.; Liu, G. Apple tree canopy geometric parameters acquirement based on 3D point clouds. Trans. Chin. Soc. Agric. Eng. 2017, 33, 175–181. [Google Scholar]
  54. Feng, J.; Ma, X.; Guan, H.; Zhu, K.; Yu, S. Calculation method of soybean plant height based on depth information. Acta Optica Sinica. pp. 1–18. Available online: http://kns.cnki.net/kcms/detail/31.1252.O4. 20190225.0920.030.html (accessed on 1 March 2019).
  55. Nieves-Chinchilla, J.; Martínez, R.; Farjas, M.; Tubio-Pardavila, R.; Cruz, D.; Gallego, M. Reverse engineering techniques to optimize facility location of satellite ground stations on building roofs. Autom. Constr. 2018, 90, 156–165. [Google Scholar] [CrossRef]
  56. Azzari, G.; Goulden, M.; Rusu, R. Rapid characterization of vegetation structure with a Microsoft Kinect sensor. Sensors 2013, 13, 2384–2398. [Google Scholar] [CrossRef] [PubMed]
  57. Xia, C.; Wang, L.; Chung, B.; Lee, J. In situ 3D segmentation of individual plant leaves using a RGB-D camera for agricultural automation. Sensors 2015, 15, 20463–20479. [Google Scholar] [CrossRef] [PubMed]
  58. Rueda-Ayala, V.P.; Peña, J.M.; Höglind, M.; Bengochea-Guevara, J.M.; Andújar, D. Comparing UAV-Based technologies and RGB-D reconstruction methods for plant height and biomass monitoring on grass ley. Sensors 2019, 19, 535. [Google Scholar] [CrossRef] [PubMed]
  59. Yang, L.; Zhang, L.; Dong, H.; Alelaiwi, A.; Saddik, A.E. Evaluating and improving the depth accuracy of Kinect for Windows v2. IEEE Sens. J. 2015, 15, 4275–4285. [Google Scholar] [CrossRef]
  60. Ma, X.; Liu, G.; Feng, J.; Zhou, W. Multi-source image registration for canopy organ of apple trees in mature period. Trans. Chin. Soc. Agric. Mach. 2014, 45, 82–88. [Google Scholar]
  61. Zhou, W.; Liu, G.; Ma, X.; Feng, J. Study on multi-image registration of apple tree at different growth stages. Acta Opt. Sin. 2014, 34, 177–183. [Google Scholar]
  62. Fukuda, T.; Ji, Y.; Umeda, K. Accurate range image generation using sensor fusion of TOF and Stereo-based Measurement. In Proceedings of the 12th France-Japan and 10th Europe-Asia Congress on Mechatronics, Tsu, Japan, 10–12 September 2018; pp. 67–70. [Google Scholar]
  63. Hu, C.; Pan, Z.; Li, P. A 3D point cloud filtering method for leaves based on manifold distance and normal estimation. Remote Sens. 2019, 11, 198. [Google Scholar] [CrossRef]
  64. Itakura, K.; Hosoi, F. Estimation of Leaf Inclination Angle in Three-Dimensional Plant Images Obtained from Lidar. Remote Sens. 2019, 11, 344. [Google Scholar] [CrossRef]
  65. Roosjen, P.P.J.; Brede, B.; Suomalainen, J.M.; Bartholomeus, H.M.; Kooistra, L.; Clevers, J.G. Improved estimation of leaf area index and leaf chlorophyll content of a potato crop using multi-angle spectral data—Potential of unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 14–26. [Google Scholar] [CrossRef]
  66. Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, color-Infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice. Remote Sens. 2018, 10, 824. [Google Scholar] [CrossRef]
  67. Zou, X.; Mõttus, M.; Tammeorg, P.; Torres, C.L.; Takala, T.; Pisek, J.; Pellikka, P. Photographic measurement of leaf angles in field crops. Agric. For. Meteorol. 2014, 184, 137–146. [Google Scholar] [CrossRef]
  68. Aminifard, M.H.; Bayat, H.; Khayyat, M. Individual modelling of leaf area in cress and radish using leaf dimensions and weight. J. Hortic. Postharvest Res. 2019, 2, 83–94. [Google Scholar]
  69. Dong, T.; Liu, J.; Shang, J.; Qian, B.; Ma, B.; Kovacs, J.M.; Shi, Y. Assessment of red-edge vegetation indices for crop leaf area index estimation. Remote Sens. Environ. 2019, 222, 133–143. [Google Scholar] [CrossRef]
  70. Larrinaga, A.; Brotons, L. Greenness Indices from a Low-Cost UAV Imagery as Tools for Monitoring Post-Fire Forest Recovery. Drones 2019, 3, 6. [Google Scholar] [CrossRef]
  71. Sanz, R.; Llorens, J.; EscolÀ, A.; Arnó, J.; Planas, S.; Román, C.; Rosell-Polo, J.R. LIDAR and non-LIDAR-based canopy parameters to estimate the leaf area in fruit trees and vineyard. Agric. For. Meteorol. 2018, 260–261, 229–239. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of the data acquisition process. (1) Maize plant; (2) Scan range; (3) Wand; (4) Laser scanning line; (5) The receiver; (6) The transmitter; (7) Processing unit; (8) Laptop.
Figure 2. Flow chart for calculating the phenotypic traits of maize plants.
Figure 3. Characteristic region classification of three-dimensional (3D) point clouds of maize. Red points indicate the single region; the other points represent the rich region and need to be denoised by the bilateral filtering algorithm.
Figure 4. Calculation method for plant height based on the fitting sphere.
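Figure 4 computes plant height via a fitting sphere placed over the canopy. As a simplified, hedged stand-in for that construction (the authors' exact sphere-based procedure is not reproduced here), plant height can be approximated as the vertical extent of the reconstructed point cloud:

```python
import numpy as np

def plant_height(points):
    """Plant height as the vertical extent of an (N, 3) point cloud (z up)."""
    z = np.asarray(points, dtype=float)[:, 2]
    return float(z.max() - z.min())

# Toy cloud: a stem running from the pot rim (z = 0) to the top leaf (z = 1.8).
cloud = np.array([[0.00, 0.00, 0.0],
                  [0.01, 0.00, 0.9],
                  [0.00, 0.01, 1.8]])
print(plant_height(cloud))  # 1.8
```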
Figure 5. Calculation method for stem diameter based on the fitting cylinder.
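The fitting-cylinder step of Figure 5 can be sketched under the simplifying assumption that the stem axis is close to vertical: the stem points are projected onto the xy-plane and a circle is fitted by linear least squares (the Kåsa method), whose diameter approximates the stem diameter. The authors' exact cylinder-fitting formulation may differ.

```python
import numpy as np

def stem_diameter(stem_points):
    """Estimate stem diameter from a near-vertical stem segment.

    Projects the stem points onto the xy-plane and fits a circle by the
    linear least-squares (Kåsa) method: |p|^2 = 2 p·c + (r^2 - |c|^2)
    is solved as a linear system in (c, r^2 - |c|^2).
    """
    xy = np.asarray(stem_points, dtype=float)[:, :2]
    A = np.hstack([2.0 * xy, np.ones((len(xy), 1))])
    b = (xy ** 2).sum(axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(w[2] + w[:2] @ w[:2])
    return 2.0 * r

# Points sampled on a 20 mm diameter stem cross-section at several heights.
theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
ring = np.c_[10 * np.cos(theta), 10 * np.sin(theta), np.linspace(0, 50, 12)]
print(round(stem_diameter(ring), 3))  # 20.0
```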
Figure 6. Calculation method for canopy breadth based on the fitting sphere.
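One standard linear least-squares formulation of the fitting sphere used in Figure 6 is the Kåsa fit sketched below; the canopy breadth can then be approximated by the fitted diameter. Whether this matches the authors' exact formulation is an assumption.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit (Kåsa method).

    Solves |p|^2 = 2 p·c + (r^2 - |c|^2) as a linear system in the
    center c and the auxiliary scalar r^2 - |c|^2. Returns (center, radius).
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    return center, radius

# Points sampled exactly on a sphere of radius 0.5 centred at (1, 2, 3).
pts = np.array([[1.5, 2.0, 3.0], [0.5, 2.0, 3.0], [1.0, 2.5, 3.0],
                [1.0, 1.5, 3.0], [1.0, 2.0, 3.5], [1.0, 2.0, 2.5]])
c, r = fit_sphere(pts)  # c ≈ (1, 2, 3), r ≈ 0.5
# Canopy breadth can then be approximated as the sphere diameter 2 * r.
```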
Figure 7. Maize plants under the pot cultivation method. (a) A representative sample set of maize plants; (b) an individual plant.
Figure 8. Raw 3D data of maize from the trefoil stage to the jointing stage. (a) A maize plant at the trefoil stage; maize plants at the jointing stage, including (b) a maize plant with 4 leaves, (c) a maize plant with 5 leaves, (d) a maize plant with 6 leaves, (e) a maize plant with 7 leaves, and (f) a maize plant with 8 leaves.
Figure 9. Example of the point cloud simplification effect in maize plants.
Figure 10. Overall effect after simplification in maize from the trefoil stage to the jointing stage. (a-1,a-2) A maize plant at the trefoil stage (the number of 3D cloud points was simplified from 10,256 to 7401); maize plants at the jointing stage, including (b-1,b-2) a maize plant with 4 leaves (the number of 3D cloud points was simplified from 19,587 to 14,521); (c-1,c-2) a maize plant with 5 leaves (the number of 3D cloud points was simplified from 26,120 to 19,854); (d-1,d-2) a maize plant with 6 leaves (the number of 3D cloud points was simplified from 30,015 to 23,025); (e-1,e-2) a maize plant with 7 leaves (the number of 3D cloud points was simplified from 39,542 to 29,525); (f-1,f-2) a maize plant with 8 leaves (the number of 3D cloud points was simplified from 43,957 to 32,019).
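The grid-based simplification shown in Figures 9 and 10 can be sketched as a voxel-grid filter: the cloud is partitioned into cubic cells and each occupied cell is replaced by the centroid of its points. The cell size and the centroid rule here are assumptions; the excerpt does not specify the authors' exact parameters.

```python
import numpy as np

def grid_simplify(points, cell=5.0):
    """Voxel-grid simplification: one representative point per occupied cell.

    Each point is assigned to a cubic cell of side `cell`; all points
    falling in the same cell are replaced by their centroid.
    """
    p = np.asarray(points, dtype=float)
    keys = np.floor(p / cell).astype(int)
    buckets = {}
    for key, pt in zip(map(tuple, keys), p):
        buckets.setdefault(key, []).append(pt)
    return np.array([np.mean(v, axis=0) for v in buckets.values()])

# 1000 random points in a 20 mm cube collapse to at most 4^3 = 64 cells.
rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 20.0, size=(1000, 3))
simplified = grid_simplify(cloud, cell=5.0)
rate = 100.0 * (len(cloud) - len(simplified)) / len(cloud)  # points removed, %
```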
Figure 11. Algorithm comparison for the denoising effect. (a) Raw data; (b) denoising effect of the Laplace algorithm; (c) denoising effect of the bilateral filtering algorithm.
Figure 12. Overall denoising effect in maize from the trefoil stage to the jointing stage. (a-1,a-2) A maize plant at the trefoil stage; maize plants at the jointing stage, including (b-1,b-2) a maize plant with 4 leaves, (c-1,c-2) a maize plant with 5 leaves, (d-1,d-2) a maize plant with 6 leaves, (e-1,e-2) a maize plant with 7 leaves, and (f-1,f-2) a maize plant with 8 leaves. Red points in the images before denoising indicate the noise points.
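The bilateral filtering used for denoising (Figures 11 and 12) weights each neighbour by both its spatial distance and its offset along the point's normal, so noise is smoothed while sharp features such as leaf edges are preserved. A brute-force sketch (precomputed normals, k-nearest neighbours and the Gaussian parameters are all assumptions, not the paper's exact settings):

```python
import numpy as np

def bilateral_filter(points, normals, sigma_s=2.0, sigma_r=1.0, k=10):
    """One bilateral-filtering pass over a point cloud.

    Each point is displaced along its (precomputed) normal by a weighted
    average of its k nearest neighbours' normal offsets; the spatial
    Gaussian down-weights distant neighbours and the range Gaussian
    down-weights neighbours far from the local tangent plane.
    """
    p = np.asarray(points, dtype=float)
    filtered = p.copy()
    for i, (pi, ni) in enumerate(zip(p, normals)):
        d2 = ((p - pi) ** 2).sum(axis=1)          # squared distances to all
        nbrs = np.argsort(d2)[1:k + 1]            # k nearest, excluding self
        h = (p[nbrs] - pi) @ ni                   # offset along the normal
        w = (np.exp(-d2[nbrs] / (2 * sigma_s ** 2))
             * np.exp(-h ** 2 / (2 * sigma_r ** 2)))
        filtered[i] = pi + ni * (w @ h) / w.sum()
    return filtered

# Demo: a plane with z-jitter becomes visibly flatter after one pass.
rng = np.random.default_rng(1)
gx, gy = np.meshgrid(np.arange(10.0), np.arange(10.0))
noisy = np.c_[gx.ravel(), gy.ravel(), rng.normal(0.0, 0.1, 100)]
normals = np.tile([0.0, 0.0, 1.0], (100, 1))
smoothed = bilateral_filter(noisy, normals)
```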
Figure 13. Effectiveness of the calculation method for phenotypic traits.
Figure 14. 3D point clouds acquired by the Kinect v2.0 sensor.
Table 1. Simplification rate of maize from the 3-leaves stage to the jointing stage.

| | 3-Leaves | 4-Leaves | 5-Leaves | 6-Leaves | 7-Leaves | 8-Leaves |
| --- | --- | --- | --- | --- | --- | --- |
| Raw data | 10,256 | 19,587 | 26,120 | 30,015 | 39,542 | 43,957 |
| After simplification | 7401 | 14,521 | 19,854 | 23,025 | 29,525 | 32,019 |
| Simplification rate (%) | 27.8 | 25.9 | 24.0 | 23.3 | 25.3 | 27.1 |
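The simplification rate in Table 1 is the fraction of points removed by the grid method. A quick check of the table's figures (the last entry computes to 27.2% rather than the table's 27.1%, presumably a rounding difference):

```python
# Simplification rate = (raw - kept) / raw * 100, per growth stage (Table 1).
raw  = [10256, 19587, 26120, 30015, 39542, 43957]
kept = [7401, 14521, 19854, 23025, 29525, 32019]
rates = [round(100 * (r - k) / r, 1) for r, k in zip(raw, kept)]
print(rates)  # [27.8, 25.9, 24.0, 23.3, 25.3, 27.2]
```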
Table 2. Comparison of algorithm performance.

| Algorithm | Maximum Error/mm | Average Error/mm |
| --- | --- | --- |
| Laplace filtering | 19.74 | 2.54 |
| Bilateral filtering | 15.57 | 1.57 |
Table 3. Comparison of accuracy for the two devices.

| Sensor | Plant Height | Stem Diameter | Canopy Breadth |
| --- | --- | --- | --- |
| FastSCAN | R² = 0.9807 | R² = 0.8907 | R² = 0.9562 |
| Kinect v2.0 | R² = 0.9858 | R² = 0.6973 | R² = 0.8854 |
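The R² values in Table 3 are coefficients of determination between measured and calculated trait values. A sketch of one common form of the computation, using hypothetical plant-height pairs (the paper's raw measurements are not given in this excerpt):

```python
import numpy as np

def r_squared(measured, calculated):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y = np.asarray(measured, dtype=float)
    f = np.asarray(calculated, dtype=float)
    ss_res = ((y - f) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Hypothetical measured vs. calculated plant heights (cm), for illustration.
measured   = [45.2, 61.0, 78.5, 96.3, 120.1]
calculated = [44.8, 62.1, 77.9, 97.0, 119.0]
r2 = r_squared(measured, calculated)
print(round(r2, 4))
```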
Table 4. Advantages and limitations of common sensors used for acquiring geometric traits.

| Sensor | Point-to-Point Distance | Advantages | Limitations |
| --- | --- | --- | --- |
| Stereo vision system | Various resolutions | Low cost; suitable for unmanned aerial vehicles (UAVs) | Heavy computation; sensitive to strong light |
| Lidar/laser sensor (e.g., Trimble TX8) | 7.5 mm at 30 m | Long measurement range; high resolution | High cost; limited information on occlusions and shadows |
| Range camera (e.g., Kinect v2.0) | >4.0 mm at more than 4 m | Low cost; high frame rate | Sensitive to strong light; low resolution |
| Hand-held laser scanner (e.g., FastSCAN) | 0.178 mm in the range of 200 mm | High resolution; high accuracy for 3D models | Short measurement range; hand-held operation |

Ma, X.; Zhu, K.; Guan, H.; Feng, J.; Yu, S.; Liu, G. Calculation Method for Phenotypic Traits Based on the 3D Reconstruction of Maize Canopies. Sensors 2019, 19, 1201. https://doi.org/10.3390/s19051201