Article

UAV-Borne LiDAR Crop Point Cloud Enhancement Using Grasshopper Optimization and Point Cloud Up-Sampling Network

1 College of Engineering, China Agricultural University, Beijing 100083, China
2 College of Water Resources & Civil Engineering, China Agricultural University, Beijing 100083, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(19), 3208; https://doi.org/10.3390/rs12193208
Submission received: 5 September 2020 / Revised: 25 September 2020 / Accepted: 28 September 2020 / Published: 1 October 2020
(This article belongs to the Special Issue UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops)

Abstract: Because of the low accuracy and density of crop point clouds obtained by Unmanned Aerial Vehicle (UAV)-borne Light Detection and Ranging (LiDAR) scanning systems, an integrated navigation and positioning optimization method based on the grasshopper optimization algorithm (GOA) and a point cloud density enhancement method were proposed. Firstly, a global positioning system (GPS)/inertial navigation system (INS) integrated navigation and positioning information fusion method based on a Kalman filter was constructed. Then, the GOA was employed to find the optimal solution by iterating the system noise variance matrix Q and the measurement noise variance matrix R of the Kalman filter. By feeding the optimal solution into the Kalman filter, the error variance of longitude was reduced from 0.0091 to 0.00046, and the error variance of latitude was reduced from 0.0047 to 0.00034. Based on the integrated navigation, a UAV-borne LiDAR scanning system was built for obtaining the crop point cloud. During offline processing, the crop point cloud was filtered and transformed into the WGS-84 coordinate system, and a density clustering algorithm improved by the particle swarm optimization (PSO) algorithm was employed for clustering segmentation. After the clustering segmentation, the pre-trained Point Cloud Up-Sampling Network (PU-net) was used to enhance the density of the point cloud data and to carry out three-dimensional reconstruction. The features of the crop point cloud were preserved by the reconstruction model; meanwhile, the density of the crop point cloud was quadrupled.

1. Introduction

Light Detection and Ranging (LiDAR) is an active sensing technology that can quickly acquire spatial information of a target or environment [1,2]. Compared with traditional remote sensing methods, LiDAR has shown great advantages in accuracy and stability [3]. In agriculture, the LiDAR system is one of the most accurate methods to measure regional vegetation structural characteristics and biophysical parameters [4]. The Unmanned Aerial Vehicle (UAV)-borne LiDAR scanning system is one of the most common platforms for obtaining crop point clouds. The crop point clouds captured by a UAV-borne LiDAR scanning system are sparse and unordered [5]. The navigation and positioning accuracy of the UAV-borne LiDAR scanning system greatly influences the accuracy of the point cloud data. Density is also an important indicator of the quality of point cloud data: higher point cloud density represents richer information about the target or environment [6]. Ensuring the accuracy and density of UAV-borne LiDAR point cloud data while containing cost is crucial for the continuous and efficient execution of LiDAR detection tasks.
For integrated navigation, the global positioning system (GPS)/inertial navigation system (INS) combination is the most widely used positioning system [7], and the Kalman filter is the most common method of data fusion [8]. Gao et al. [9] developed an INS/GPS/LiDAR integrated navigation system through an extended Kalman filter (EKF) using a hybrid scan matching algorithm. Shabani et al. [10] explored the characteristics, advantages, and disadvantages of the direct and indirect Kalman filtering commonly used in GPS/INS integrated navigation and proposed a novel asynchronous direct Kalman filter. Zhang et al. [11] developed a low-cost global navigation satellite system (GNSS)/INS integrated navigation method and an adaptive Kalman filter, based on a supervised machine learning model, for overcoming the measurement noise of GNSS. Yu et al. [12] used single-frequency real-time kinematic carrier phase (RTK) technology to deduce information including speed and orientation under GPS/BeiDou Navigation Satellite System (BDS) for polishing the inaccurate INS information. Norouz et al. [13] proposed an improved Kalman filter algorithm based on the small probability distribution of the state variable between the measurement updates of the multi-rate GPS/INS-coupled navigation filter, which greatly reduced the amount of calculation; the algorithm performed well in numerical simulations. All of the above integrated navigation systems suffered from low accuracy, except that of Yu et al. [12]; however, RTK-based high-accuracy integrated navigation systems cannot be widely popularized due to their price. During data-fusion experiments with the Kalman filter, the longitude and latitude errors were sharply affected by the parameters of the Kalman filter, especially the system noise sequence Q and the measurement noise sequence R; the same situation also occurred in the implementations of Refs. [10,11]. To promote the accuracy of our integrated-navigation-based scanning system, a meta-heuristic method was employed to find the optimum parameters of the integrated navigation in our UAV scanning system. For multi-solution problems, genetic algorithms (GA) [14], particle swarm optimization (PSO) [15], and ant colony optimization (ACO) [16] are the most popular, along with recently proposed swarm intelligence optimizers such as the bat algorithm (BA) [17] and the grasshopper optimization algorithm (GOA) [18]. All of the multi-solution meta-heuristic methods have their own merits for different problems [19]. In order to reduce the longitude and latitude errors of the integrated navigation, the meta-heuristic methods were employed to find the optimum system noise sequence Q and measurement noise sequence R.
After the crop point cloud data were obtained by the UAV scanning system, the raw point cloud data consisted of crop points and other points, and manual classification was extremely inefficient for different point cloud data. Therefore, a clustering segmentation method was employed to extract the crop data from the raw point cloud data. The density-based clustering method, density-based spatial clustering of applications with noise (DBSCAN), has attracted a lot of attention; besides improvement of the algorithm itself, finding the key parameters of clustering is also a new direction of optimization [20]. As with the integrated navigation, a meta-heuristic method was employed to optimize the clustering segmentation method. According to experimental results on the density distribution characteristics of plane point clouds under different conditions, Kedzierski et al. [21] found that the point cloud density could be preliminarily estimated based on distance and angle, but the actual point cloud density should take into account the influence of object shape and environment. Rupnik et al. [22] proposed the formula and calculation steps for computing the point cloud density. Huang et al. [23] proposed an edge-aware point set resampling method that gradually approaches the edges and corners; however, the quality of its results largely depends on the accuracy of the normal of a given point and on the fine-tuning of parameters. Based on Refs. [21,23], dense and accurate point clouds are very useful for crop point cloud recognition and 3D reconstruction. A high-cost multi-line LiDAR can be simulated by low-cost single-line LiDAR point clouds through the point cloud up-sampling method without destroying the features.
In view of the current problems and inspired by existing methods, in order to improve the quality of point cloud data, a combined navigation optimization algorithm based on GOA and a point cloud density enhancement algorithm based on the Point Cloud Up-Sampling Network (PU-net) were proposed. The novel contributions of this article are as follows:
(1)
In order to obtain the optimal solution of the initial parameters of the Kalman filter quickly and accurately, GOA took the sum of the longitude and latitude error variances as the fitness function to find the optimal solution to iterate the system noise variance matrix Q and the measurement noise variance matrix R. Based on the Q and R matrices from GOA, the positioning error of the UAV-borne LiDAR scanning system was greatly reduced; meanwhile, the computing burden was decreased, and the accuracy of point cloud data acquisition was improved;
(2)
The density clustering algorithm was improved by the linear decreasing particle swarm optimization (PSO) algorithm. In order to solve the problem of sparse density, in this paper, a pre-trained PU-net was used for enhancing the point cloud density of the point cloud data obtained by UAV-borne LiDAR. Experiments have verified that the enhanced point cloud data retain the features of the raw point cloud data.

2. Framework of the Proposed Method

For UAV-borne LiDAR point cloud data enhancement, the integrated navigation algorithm based on GOA and the point cloud density enhancement algorithm based on a neural network are shown in Figure 1. The specific implementation steps were as follows:
(1)
Firstly, the GPS/INS integrated navigation information fusion framework was built by the Kalman filter. Then, the GOA was used to optimize the initial parameters of the Kalman filter, and the optimal solution was returned to the Kalman filter to estimate the output state of the GPS/INS integrated navigation system;
(2)
After data acquisition, the point cloud data was transformed into the WGS-84 coordinate system from the LiDAR coordinate system. Then the point cloud data was filtered for removing the irrelevant points. Finally, the two-dimensional point cloud data was transformed into a three-dimensional point cloud according to the flight trajectory of UAV;
(3)
The PSO algorithm was employed to optimize the density clustering algorithm for crop point cloud segmentation. Then the crop point cloud data was fed into the trained network to obtain the crop point cloud data after the point cloud density was enhanced. Finally, the point cloud data was reconstructed in three dimensions.

3. Integrated Navigation Enhancement Based on GOA

In GPS/INS integrated navigation, the Kalman filter was usually used for fusing the information of GPS and INS. The better the filter of the integrated navigation, the better it could integrate the advantages of each sensor. Since the GOA is not affected by nonlinearity or scale, compared with other global optimization algorithms, it can find more effective optimal solutions with faster convergence speed [24].

3.1. Framework of GPS/INS Integrated Navigation

GPS and INS have complementary advantages. By combining GPS and INS in an appropriate way, their own drawbacks could be overcome, and the navigation accuracy would significantly improve. In order to improve the robustness of the system and reduce the complexity of the system, this paper adopted the loose combination mode to fuse the information of GPS and INS, as shown in Figure 2.
The state variables of the integrated navigation filter were composed of the random drift error of the INS gyroscope, the velocity error, and the position error, as well as the errors caused by the attitude angle and the accelerometer. The equation of the state was as follows:
$\dot{X} = FX + Gw$
The state variables of the INS integrated navigation system include longitude error, latitude error, velocity error in east and north directions, attitude angle error in east, north, and sky directions, random drift, and accelerometer bias error. The 13-dimensional equation of the state was as follows:
$\dot{X}_I = F_I X_I + G_I \omega_I$
The equivalent range error of the receiver clock bias $\delta t_u$ and the equivalent range rate error of the receiver clock frequency error $\delta t_{ru}$ were the GPS state errors; the equation of the state can be described as follows:
$\dot{X}_G = F_G X_G + G_G w_G$
The measurement equation of the position velocity combination was expressed as follows:
$\begin{bmatrix} L_{INS} \\ \lambda_{INS} \\ h_{INS} \end{bmatrix} = \begin{bmatrix} L_t + \delta L \\ \lambda_t + \delta \lambda \\ h_t + \delta h \end{bmatrix}$
where $L_{INS}$, $\lambda_{INS}$, and $h_{INS}$ were the longitude, latitude, and altitude calculated by inertial navigation; $L_t$, $\lambda_t$, and $h_t$ were the real longitude, latitude, and height of the carrier; and $\delta L$, $\delta \lambda$, and $\delta h$ were the longitude, latitude, and altitude errors calculated by INS.
The position measurement information of the GPS receiver was expressed by subtracting the corresponding error from the real value:
$\begin{bmatrix} L_{GPS} \\ \lambda_{GPS} \\ h_{GPS} \end{bmatrix} = \begin{bmatrix} L_t - \frac{N_n}{R} \\ \lambda_t - \frac{N_e}{R \cos L} \\ h_t - N_u \end{bmatrix}$
where $N_n$, $N_e$, and $N_u$ were the position errors of the GNSS receiver along the east, north, and sky directions. Then the external measurement position error can be defined as:
$Z_{pos} = \begin{bmatrix} \lambda_{INS} - \lambda_{GPS} \\ L_{INS} - L_{GPS} \\ h_{INS} - h_{GPS} \end{bmatrix}$
The position measurement equation was as follows:
$Z_{POS} = H_{POS}(t) X(t) + V_{POS}(t)$
The velocity measurement information equation of INS can be expressed as follows:
$\begin{bmatrix} v_{eINS} \\ v_{nINS} \\ v_{uINS} \end{bmatrix} = \begin{bmatrix} v_e + \delta v_e \\ v_n + \delta v_n \\ v_u + \delta v_u \end{bmatrix}$
where $v_{eINS}$, $v_{nINS}$, and $v_{uINS}$ were the east, north, and sky speeds calculated by the inertial navigation system, and $v_e$, $v_n$, and $v_u$ were the velocities of the object in the east, north, and sky directions.
The velocity information provided by GPS can be expressed by the difference between the real value and the corresponding error in the geographical coordinate system:
$\begin{bmatrix} v_{eGPS} \\ v_{nGPS} \\ v_{uGPS} \end{bmatrix} = \begin{bmatrix} v_e - M_e \\ v_n - M_n \\ v_u - M_u \end{bmatrix}$
The speed error of the external measurement can be defined as:
$Z_{vel} = \begin{bmatrix} v_{eINS} - v_{eGPS} \\ v_{nINS} - v_{nGPS} \\ v_{uINS} - v_{uGPS} \end{bmatrix}$
The velocity measurement equation was as follows:
$Z_{vel} = H_{vel}(t) X(t) + V_{vel}(t)$
The integrated measurement equation of the GPS/INS integrated navigation system was obtained as follows:
$Z(t) = \begin{bmatrix} \lambda_{INS} - \lambda_{GPS} \\ L_{INS} - L_{GPS} \\ h_{INS} - h_{GPS} \\ v_{eINS} - v_{eGPS} \\ v_{nINS} - v_{nGPS} \\ v_{uINS} - v_{uGPS} \end{bmatrix} = H(t) X(t) + V(t)$
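To make the loose-coupling update concrete, the following Python sketch applies one standard Kalman measurement update to the INS-minus-GPS position and velocity differences defined above. The state ordering, dimensions, and numeric values are illustrative assumptions, not the exact configuration used in this paper.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One standard Kalman measurement update for the error state."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # corrected error-state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P   # updated covariance
    return x_new, P_new

# Hypothetical measurement: INS-minus-GPS differences, ordered as in
# Z(t) above (latitude, longitude, altitude, then ENU velocities).
ins_pos = np.array([40.0001, 116.0002, 50.3])   # lat (deg), lon (deg), h (m)
gps_pos = np.array([40.0000, 116.0000, 50.0])
ins_vel = np.array([0.98, 0.05, -0.02])         # east, north, up (m/s)
gps_vel = np.array([1.00, 0.00, 0.00])
z = np.concatenate([ins_pos - gps_pos, ins_vel - gps_vel])

n = 13                                      # error-state dimension (Section 3.1)
H = np.zeros((6, n)); H[:, :6] = np.eye(6)  # assumed: pos/vel errors come first
x, P = np.zeros(n), np.eye(n) * 0.01
R = np.eye(6) * 1e-4                        # measurement noise (illustrative)
x, P = kalman_update(x, P, z, H, R)
```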

3.2. Grasshopper Optimization Algorithm (GOA)

GOA is a swarm intelligence optimization algorithm proposed by Saremi et al. [18] in 2017, simulating the predatory migration behavior of grasshopper populations in nature. The main characteristics of larval-stage grasshoppers were slow movement and small stride; in contrast, long-distance and sudden movements were essential features of adult grasshoppers. Searching for food sources was another important feature of the grasshopper swarm. The mathematical model of grasshopper swarm behavior was as follows:
$X_i = \sum_{j=1, j \neq i}^{N} s\left( \left| x_j - x_i \right| \right) \frac{x_j - x_i}{d_{ij}} - g \hat{e}_g + u \hat{e}_w$
where $X_i$ represented the position of the $i$th grasshopper, $N$ represented the size of the grasshopper swarm, $s$ was the interaction force between grasshoppers, and $d_{ij}$ was the distance between the $i$th and $j$th grasshoppers. The interaction force (attraction and repulsion, shown in Figure 3) $s$ between grasshoppers was defined as follows:
$s(d) = f e^{-\frac{d}{l}} - e^{-d}$
where $d$ was the distance between two grasshoppers, $f$ represented the intensity of the interaction force (attraction and repulsion), and $l$ represented the attractive length scale. The modified form of the swarm behavior model was as follows:
$X_i^{dim} = c \left( \sum_{j=1, j \neq i}^{N} c\, \frac{ub^{dim} - lb^{dim}}{2}\, A\, \frac{x_j - x_i}{d_{ij}} \right) + \hat{T}^{dim}$
$A = s\left( \left| x_j^{d} - x_i^{d} \right| \right)$
where $ub^{dim}$ was the upper boundary of the $dim$th dimension, $lb^{dim}$ was the lower boundary, and $\hat{T}^{dim}$ was the target value in the $dim$th dimension. $c$ represented a self-adaptive ratio, defined as follows:
$c = c_{max} - l\, \frac{c_{max} - c_{min}}{L}$
where $c_{max}$ was the maximum value and $c_{min}$ was the minimum value; $l$ represented the current iteration, while $L$ was the maximum iteration.
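The position-update equations above translate directly into code. The following is a minimal Python sketch of GOA under those equations; the population size, $f$, $l$, and the boundary handling are illustrative choices, and the distance remapping used by some published implementations is omitted.

```python
import numpy as np

def s(d, f=0.5, l=1.5):
    """Interaction force s(d) = f*exp(-d/l) - exp(-d)."""
    return f * np.exp(-d / l) - np.exp(-d)

def goa_minimize(fitness, lb, ub, n_agents=20, iters=200,
                 c_max=1.0, c_min=1e-5):
    """Minimal GOA sketch following the position-update equations above."""
    dim = len(lb)
    X = np.random.uniform(lb, ub, (n_agents, dim))
    fit = np.array([fitness(x) for x in X])
    best = fit.argmin()
    target, best_fit = X[best].copy(), fit[best]      # T_hat and its fitness
    for t in range(1, iters + 1):
        c = c_max - t * (c_max - c_min) / iters       # linearly decaying c
        X_new = np.empty_like(X)
        for i in range(n_agents):
            social = np.zeros(dim)
            for j in range(n_agents):
                if j == i:
                    continue
                d_ij = np.linalg.norm(X[j] - X[i]) + 1e-12
                # per-dimension A = s(|x_j - x_i|), scaled unit direction
                social += (c * (ub - lb) / 2
                           * s(np.abs(X[j] - X[i]))
                           * (X[j] - X[i]) / d_ij)
            X_new[i] = np.clip(c * social + target, lb, ub)
        X = X_new
        fit = np.array([fitness(x) for x in X])
        if fit.min() < best_fit:                      # track the best target
            best_fit = fit.min()
            target = X[fit.argmin()].copy()
    return target, best_fit
```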

3.3. Kalman Filter Optimization Using GOA

GOA was employed to optimize the initial parameters Q and R of the Kalman filter. Q was the variance matrix of the system noise sequence, which affected the filtering performance and parameter estimation accuracy of the Kalman filter algorithm; R was the variance matrix of the measurement noise sequence, which was related to the correction speed of the filtering and the stability of the filtering process. Through the optimization of the initial parameters, the optimal position of the grasshopper was brought as close to the real value as possible; thus, the accuracy of the Kalman filter algorithm was improved greatly. The sum of the variances of the longitude and latitude errors was selected as the fitness function. The pipeline of the Kalman filter optimization is shown in Figure 4.
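As a sketch of the wiring between GOA and the filter, the fitness below builds candidate diagonal Q and R matrices, runs the filter once, and returns the sum of the longitude- and latitude-error variances. `run_integrated_navigation` is a hypothetical helper standing in for the GPS/INS simulation of Section 3.1, and the diagonal parameterization and search bounds are assumptions.

```python
import numpy as np

def qr_fitness(params):
    """Build candidate diagonal Q and R, run the GPS/INS filter once,
    and return the sum of longitude- and latitude-error variances."""
    q_scale, r_scale = params
    Q = np.eye(13) * q_scale      # system noise variance matrix Q
    R = np.eye(6) * r_scale       # measurement noise variance matrix R
    # run_integrated_navigation is hypothetical: it should return the
    # per-step longitude and latitude error sequences of the filter.
    lon_err, lat_err = run_integrated_navigation(Q, R)
    return np.var(lon_err) + np.var(lat_err)

# Search bounds are illustrative assumptions.
best_qr, best_fit = goa_minimize(qr_fitness,
                                 lb=np.array([1e-8, 1e-8]),
                                 ub=np.array([1e-2, 1e-2]))
```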

4. UAV-Borne LiDAR System Construction and Point Cloud Data Processing

The UAV-borne LiDAR system was mainly composed of the LiDAR sensor, GPS/INS integrated navigation module, and UAV carrier. After the point cloud data were collected by the UAV-borne LiDAR scanning system, it was necessary to transform the point cloud data from the LiDAR coordinate system to the WGS-84 coordinate system.

4.1. UAV-Borne LiDAR System

In view of the application of LiDAR in the agricultural field, this paper selected the RPLiDAR-a2 of SLAMTEC®, a single-line two-dimensional LiDAR, which greatly reduced the cost of the equipment. The WitMotion® GPS/Inertial Measurement Unit (IMU) was selected to collect the navigation and positioning information of the UAV. In order to supply electricity to the LiDAR and the IMU, a small pad was used to integrate the system. The DJI® M100, a lightweight UAV, was selected as the carrier. Finally, the UAV-borne LiDAR scanning system was completed, as shown in Figure 5 and Figure 6.
Due to the single scanning line of the RPLiDAR-a2, the collected point cloud data were two-dimensional, containing only Y and Z coordinates. In order to generate three-dimensional point cloud data, the X-axis data were inversely solved from the IMU measurements according to the flight trajectory and speed of the UAV.
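A minimal sketch of this line-stacking step, assuming a constant forward speed taken from the navigation solution and one (y, z) scan line per timestamp:

```python
import numpy as np

def scans_to_3d(scans, timestamps, speed):
    """Stack 2-D (y, z) scan lines into a 3-D cloud: each line gets
    the along-track x distance travelled by the UAV since the first
    scan. A constant speed is assumed here; in practice it comes
    from the integrated navigation solution."""
    points = []
    t0 = timestamps[0]
    for scan, t in zip(scans, timestamps):
        x = speed * (t - t0)                      # along-track distance (m)
        xs = np.full((scan.shape[0], 1), x)
        points.append(np.hstack([xs, scan]))      # columns: x, y, z
    return np.vstack(points)

# e.g. 5 scan lines of 360 (y, z) points at 10 Hz, 1 m/s forward speed
cloud = scans_to_3d([np.random.rand(360, 2) for _ in range(5)],
                    np.arange(5) / 10.0, speed=1.0)
```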

4.2. Coordinate Transformation of Point Cloud Data

It was assumed that the coordinate of the laser foot point in the instantaneous laser coordinate system was $(x_{SL}, y_{SL}, z_{SL})^T$:
$\begin{bmatrix} x_{SL} \\ y_{SL} \\ z_{SL} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \rho \end{bmatrix}$
There was an angle $\theta$ between the instantaneous laser coordinate system and the laser scanning reference coordinate system. The coordinates of the raw point cloud data in the laser scanning reference coordinate system were $(x_L, y_L, z_L)^T$:
$\begin{bmatrix} x_L \\ y_L \\ z_L \end{bmatrix} = R_L \begin{bmatrix} x_{SL} \\ y_{SL} \\ z_{SL} \end{bmatrix}$
where $R_L$ represented the rotation matrix of the scan angle $\theta$.
Generally, the errors between the different coordinate axes were analyzed and unified as the placement error angles $\alpha$, $\beta$, and $\gamma$. The deviation between the coordinate origin and the reference center was treated as the error $t_L = (\Delta x_{IL}, \Delta y_{IL}, \Delta z_{IL})^T$. The coordinates of the raw point cloud in the inertial platform reference coordinate system were $(x_I, y_I, z_I)^T$:
$\begin{bmatrix} x_I \\ y_I \\ z_I \end{bmatrix} = R_M \begin{bmatrix} x_L \\ y_L \\ z_L \end{bmatrix} + \begin{bmatrix} \Delta x_{IL} \\ \Delta y_{IL} \\ \Delta z_{IL} \end{bmatrix}$
where $R_M = R(\gamma) \times R(\beta) \times R(\alpha)$.
In the integrated navigation system, there were also positioning errors between GPS and INS. During the installation process, there was a certain deviation between the center of the GPS and the reference center of the inertial components; this deviation was defined as $t_G = (\Delta x_{IG}, \Delta y_{IG}, \Delta z_{IG})^T$, and it could be measured offline. INS measured the attitude information of the carrier in real time, including the roll angle (R), pitch angle (P), and heading angle (H), obtained the velocity and position information of the carrier through integration, and fed them back to the user in real time. At the same time, the coordinate rotation matrix was obtained by calculating the attitude information fed back from the IMU. The coordinates of the raw point cloud data in the local reference coordinate system were $(x_{LH}, y_{LH}, z_{LH})^T$:
$\begin{bmatrix} x_{LH} \\ y_{LH} \\ z_{LH} \end{bmatrix} = R_N \begin{bmatrix} x_I \\ y_I \\ z_I \end{bmatrix} - \begin{bmatrix} \Delta x_{IG} \\ \Delta y_{IG} \\ \Delta z_{IG} \end{bmatrix}$
where $R_N = R(H) \times R(P) \times R(R)$.
Finally, the point cloud data should be transformed into the WGS-84 coordinate system. The rotation matrix $R_W$ was a function of the latitude and longitude of the region. The point cloud could be transformed into the WGS-84 coordinate system using the rotation matrix:
$\begin{bmatrix} x_{84} \\ y_{84} \\ z_{84} \end{bmatrix} = R_W \begin{bmatrix} x_{LH} \\ y_{LH} \\ z_{LH} \end{bmatrix} + \begin{bmatrix} x_{84} \\ y_{84} \\ z_{84} \end{bmatrix}_{Antenna\ phase\ center}$
Based on the above analysis, the coordinates of the raw point cloud in WGS-84 coordinate system were as follows:
$\begin{bmatrix} x_{84} \\ y_{84} \\ z_{84} \end{bmatrix} = R_W R_N \left[ R_M R_L \begin{bmatrix} 0 \\ 0 \\ \rho \end{bmatrix} + \begin{bmatrix} \Delta x_{IL} \\ \Delta y_{IL} \\ \Delta z_{IL} \end{bmatrix} - \begin{bmatrix} \Delta x_{IG} \\ \Delta y_{IG} \\ \Delta z_{IG} \end{bmatrix} \right] + \begin{bmatrix} x_{84} \\ y_{84} \\ z_{84} \end{bmatrix}_{Antenna\ phase\ center}$
In addition, in order to obtain data containing the latitude and longitude information, it was necessary to convert the three-dimensional rectangular coordinates $(X, Y, Z)$ into geodetic coordinates $(L, \lambda, H)$. The conversion formulas were as follows:
$L = \arctan(Y / X)$
$\lambda = \arctan\left[ \frac{Z}{\sqrt{X^2 + Y^2}} \left( 1 - \frac{e^2 N}{N + H} \right)^{-1} \right]$
$H = \frac{\sqrt{X^2 + Y^2}}{\cos \lambda} - N$
where $X$, $Y$, and $Z$ were the three-dimensional rectangular coordinate components; $L$, $\lambda$, and $H$ were the earth longitude, latitude, and height; $e$ represented the first eccentricity of the ellipsoid; $N = a / \sqrt{1 - e^2 \sin^2 \lambda}$ was the radius of curvature in the prime vertical; and $a$ was the semimajor axis of the ellipsoid.
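Because the latitude appears on both sides of the $\lambda$ equation (through $N$ and $H$), the conversion is usually solved by fixed-point iteration. A minimal Python sketch of the $(L, \lambda, H)$ equations above, assuming WGS-84 ellipsoid constants:

```python
import numpy as np

A_WGS84 = 6378137.0                  # semimajor axis a (m)
E2_WGS84 = 6.69437999014e-3          # first eccentricity squared e^2

def ecef_to_geodetic(X, Y, Z, iters=5):
    """Iterative solution of the (L, lambda, H) equations above."""
    L = np.arctan2(Y, X)                          # longitude
    p = np.hypot(X, Y)
    lat = np.arctan2(Z, p * (1 - E2_WGS84))       # initial latitude guess
    for _ in range(iters):
        N = A_WGS84 / np.sqrt(1 - E2_WGS84 * np.sin(lat) ** 2)
        H = p / np.cos(lat) - N                   # height above ellipsoid
        lat = np.arctan2(Z, p * (1 - E2_WGS84 * N / (N + H)))
    return np.degrees(L), np.degrees(lat), H

# Illustrative point near Beijing (values are not from this paper)
print(ecef_to_geodetic(-2148744.0, 4426641.0, 4044292.0))
```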

5. Point Cloud Density Enhancement Based on PU-Net

Point cloud density enhancement aims to increase the number of points without destroying the characteristics of the original point cloud. By enhancing the density of the point cloud, the richness and accuracy of the 3D reconstruction of point cloud data can be improved without increasing equipment cost.

5.1. Point Cloud Preprocessing

After the coordinate transformation, it was necessary to remove irrelevant noise points from the original point cloud data. In this paper, the statistical filtering method was selected. Statistical filtering calculated the distance between each point and its surrounding neighborhood points; according to the neighborhood distances and a standard deviation threshold, the points that did not meet the requirements were identified as outliers.
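A minimal sketch of this statistical filter, assuming k nearest neighbors and a global mean-plus-standard-deviation threshold (the neighborhood size and threshold ratio are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_filter(points, k=20, std_ratio=2.0):
    """Keep a point if its mean distance to its k nearest neighbours
    lies within std_ratio standard deviations of the global mean of
    those per-point mean distances."""
    tree = cKDTree(points)
    # query k+1 neighbours because the nearest neighbour is the point itself
    dists, _ = tree.query(points, k=k + 1)
    mean_d = dists[:, 1:].mean(axis=1)
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d < threshold]

cloud = np.random.rand(10000, 3)         # stand-in for the transformed cloud
filtered = statistical_filter(cloud)
```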

5.2. Point Cloud Density Clustering Segmentation Method Based on PSO Algorithm

After the filtering in the previous section, the point cloud data obtained included ground and crop points. In order to separate the ground and the crop, the density clustering algorithm was used, and the linear decreasing PSO algorithm was employed to improve it.
DBSCAN was a density-based clustering algorithm proposed by Ester et al. It was designed to achieve clustering and segmentation of arbitrary data in spatial and non-spatial high-dimensional databases. By calculating the distance between each point and its neighborhood, the points with similar density were classified into the same class.
The $\varepsilon$-neighborhood of an arbitrary point $p$ was defined as follows:
$N_{Eps}(p) = \{ q \in D \mid dist(p, q) < Eps \}$
where $D$ represented the target database. Point $p$ was a core point if its $\varepsilon$-neighborhood contained at least the minimum number of points; the core point was defined as follows:
$\left| N_{Eps}(p) \right| \geq MinPts$
where $Eps$ and $MinPts$ were the neighborhood radius and the minimum number of points in the $\varepsilon$-neighborhood of a core point. Through the calculation over all the points, points meeting the set conditions had similar density and were classified into the same class; all points were processed until the termination condition was met. Because the selection of these two initial parameters mainly depended on human experience and repeated experiments, in order to improve efficiency, this paper employed the linear decreasing PSO algorithm to iteratively optimize the two initial parameters and determine their optimal solution.
The PSO algorithm was a swarm intelligence bionic optimization algorithm: each particle $i$ had a position $X_i$ and a velocity $V_i$; $P_i$ was the best position found by particle $i$, and $P_g$ was the best position found by the swarm. Each particle's velocity and position update formulas were as follows:
$V_i^{t+1} = \omega V_i^t + c_1 r_1 \left( P_i - X_i^t \right) + c_2 r_2 \left( P_g - X_i^t \right)$
$X_i^{t+1} = X_i^t + V_i^{t+1}$
where $t$ represented the iteration, $V_i^{t+1}$ was the velocity at iteration $t+1$, and $\omega$ represented the inertia weight. $c_1$ and $c_2$ were acceleration coefficients, $r_1, r_2 \in [0, 1]$ were random numbers, and $X_i^{t+1}$ was the position at iteration $t+1$.
In the initial stage of the algorithm, the inertia weight ω was given a larger value to enhance the global exploration ability of the algorithm. With the continuous exploration process, the value of ω could be linearly reduced to enhance the local search and optimization ability of the algorithm, so that particles could gradually approach the optimal solution in the neighborhood of the optimal value. The value of ω was expressed as follows:
$\omega = \omega_{min} + (\omega_{max} - \omega_{min}) \times \frac{maxiter - t}{maxiter}$
where $\omega_{max}$ represented the maximum inertia coefficient, $\omega_{min}$ represented the minimum, and $maxiter$ was the maximum iteration.
The PSO optimization pipeline is shown in Figure 7.
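A minimal Python sketch of PSO with the linearly decreasing inertia weight defined above; the coefficient values are common defaults, not necessarily those used in this paper:

```python
import numpy as np

def pso_minimize(fitness, lb, ub, n_particles=20, iters=200,
                 w_max=0.9, w_min=0.4, c1=2.0, c2=2.0):
    """PSO with the linearly decreasing inertia weight omega."""
    dim = len(lb)
    X = np.random.uniform(lb, ub, (n_particles, dim))
    V = np.zeros((n_particles, dim))
    P = X.copy()                                   # personal best positions
    p_fit = np.array([fitness(x) for x in X])
    g = P[p_fit.argmin()].copy()                   # global best position
    for t in range(iters):
        w = w_min + (w_max - w_min) * (iters - t) / iters
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lb, ub)
        fit = np.array([fitness(x) for x in X])
        improved = fit < p_fit                     # update personal bests
        P[improved], p_fit[improved] = X[improved], fit[improved]
        g = P[p_fit.argmin()].copy()               # update global best
    return g, p_fit.min()
```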

5.3. Point Cloud Density

The point cloud data acquired by UAV-borne LiDAR were three-dimensional and usually expressed in terms of plane density. The point cloud density in national standards was defined along the normal direction of elevation [22]:
$\rho = \frac{N}{A} = \frac{\sum_{i=0}^{k} N_i}{\sum_{i=0}^{k} A_i}$
where $\rho$ represented the point cloud density and $N$ represented the total number of points. Furthermore, $N_i$ was the number of points in the $i$th field, $k$ was the number of scanning areas, and $A_i$ was the measure of the $i$th scanning area.
The point cloud density calculated by the above formula was the average point cloud density in the scanning area. In order to further quantify the point density distribution in the scanning area, the reciprocal of the area affected by a single point was taken as the density of the point:
d = 1 / A
The main factors affecting the density of the point cloud were the LiDAR design parameters, system characteristics, carrier, and scanning area.
The distance between points directly reflected the density of the point cloud. For a pulsed laser ranging sensor, the distance from the heading point to the edge point determined the density distribution of the point cloud. The design of the aerial photography parameters referred to the distance between heading and edge points. Sensor parameters such as the scanning frequency, scanning bandwidth, laser emission frequency, field of view angle, flight height, flight speed, and the distance between point clouds were considered in designing the LiDAR system.
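As a small worked example of the average-density formula above (all values are illustrative):

```python
import numpy as np

def mean_point_density(counts, areas):
    """Average density over k scanning areas: sum(N_i) / sum(A_i)."""
    return np.sum(counts) / np.sum(areas)

# Two strips of 10,000 and 6,000 points covering 8 m^2 and 5 m^2:
rho = mean_point_density([10000, 6000], [8.0, 5.0])   # ~1230.8 pts/m^2
single_point_density = 1.0 / 8e-4                     # d = 1/A for A = 8 cm^2
```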

5.4. Density Enhancement Using Point Cloud Up-Sampling Network (PU-Net)

Point cloud up-sampling was essentially similar to the problem of image super-resolution [25]. Yu et al. [26] proposed a data-driven point cloud up-sampling neural network (PU-net), which was applied at the patch level. At the same time, a joint loss function was used to guide the up-sampled points to retain a uniformly distributed object surface. The main idea of PU-net was to capture the features of each point at every level, expand them with multiple convolution blocks in the feature hidden layers, and finally divide the hidden layers into multi-level features and up-sample them into the point cloud. The pre-trained PU-net was employed for enhancing the density of the crop point cloud in this paper. There were four steps in PU-net up-sampling: patch extraction, point feature embedding, feature expansion, and point coordinate reconstruction. It is noteworthy that a series of sub-pixel convolution layers were employed to expand the features of neighborhood points in the feature expansion stage:
$f' = \mathrm{Reshape}\left( \left[ \mathrm{conv}_1^2\left( \mathrm{conv}_1^1(f) \right), \ldots, \mathrm{conv}_r^2\left( \mathrm{conv}_r^1(f) \right) \right] \right)$
where $\mathrm{conv}_i^1(\cdot)$ and $\mathrm{conv}_i^2(\cdot)$ were convolution layers with $1 \times 1$ kernels for adjusting the channels of the patch feature $f$. The output, combined feature $f'$, inherited the spatial characteristics needed to retain the information of the raw data. After coordinate reconstruction, the density evaluation of the previous section was employed to show the enhancement effect directly.
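The following PyTorch sketch illustrates the feature-expansion idea: $r$ parallel branches of two 1×1 convolutions expand $N$ per-point features into $r \cdot N$ features, which a small per-point regressor maps back to xyz coordinates. Channel sizes and layer counts are illustrative and do not reproduce the published PU-net configuration.

```python
import torch
import torch.nn as nn

class FeatureExpansion(nn.Module):
    """Sketch of the feature-expansion step: r branches of two 1x1
    convolutions expand (B, C, N) patch features into r*N features,
    then a per-point regressor reconstructs xyz coordinates."""
    def __init__(self, in_ch=128, mid_ch=128, r=4):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Conv1d(in_ch, mid_ch, 1), nn.ReLU(),
                          nn.Conv1d(mid_ch, mid_ch, 1), nn.ReLU())
            for _ in range(r))
        self.coord = nn.Sequential(nn.Conv1d(mid_ch, 64, 1), nn.ReLU(),
                                   nn.Conv1d(64, 3, 1))  # xyz regression

    def forward(self, f):                        # f: (B, C, N)
        expanded = torch.cat([b(f) for b in self.branches], dim=2)
        return self.coord(expanded)              # (B, 3, r*N)

# 256 input points per patch -> 1024 up-sampled points (4x, as in Section 6.3)
up = FeatureExpansion()
xyz = up(torch.randn(2, 128, 256))               # shape: (2, 3, 1024)
```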

6. Results

In order to verify the stability of the UAV-borne LiDAR scanning system, the experiments were carried out in integrated navigation testing, point cloud density clustering, and point cloud density enhancement.

6.1. GOA-Based Integrated Navigation

The integrated navigation experiment was carried out in MATLAB 2017a. It was supposed that the initial longitude of the UAV was $L = 40°$, the initial latitude was $\lambda = 116°$, and the velocity of the UAV was 1 m/s. The filtering period was $T = 1$ s. The initial value of the filtering covariance was as follows:
$P(0) = \mathrm{diag}\left[ (0.00254°)^2, (0.00446°)^2, (0.1\ \mathrm{m/s})^2, (0.1\ \mathrm{m/s})^2, (1°)^2, (1°)^2, (1°)^2, (1°)^2, (1°)^2, (1°)^2, 10^{-5}g, 10^{-5}g, 10^{-5}g \right]$
The initial value of integrated navigation was:
$X(0) = \left[ 0\ \mathrm{m}, 0\ \mathrm{m}, 0.5\ \mathrm{m/s}, 0.5\ \mathrm{m/s}, 0.08°, 0.08°, 0.08°, 0.16°, 0.1°/\mathrm{h}, 0.1°/\mathrm{h}, 0.1°/\mathrm{h}, 0.1°/\mathrm{h}, 10^{-5}g, 10^{-5}g, 10^{-5}g \right]$
The initial parameters Q and R were obtained with the sum of the longitude and latitude error variances as the fitness function and were fed into the system. The GOA-based integrated navigation error diagrams are shown in Figure 8 and Figure 9, and the PSO-based integrated navigation error diagrams are shown in Figure 10 and Figure 11.
As shown in Table 1, both the longitude $L$ variance error and the latitude $\lambda$ variance error were reduced greatly compared with raw integrated navigation. It is worth noting that the GOA-based integrated navigation showed better performance than the PSO-based integrated navigation; it is suspected that the optimum-finding ability of GOA was better than that of PSO in the integrated navigation task. For more precise optimal Q and R, the GOA-based integrated navigation was employed for the UAV-borne LiDAR scanning system.

6.2. PSO-Based Point Cloud Clustering Segmentation

The meta-heuristic algorithms were employed to optimize the key elements of clustering segmentation: $Eps$ and $MinPts$. The Davies–Bouldin index (DBI) was employed as the fitness function for the joint optimization of $Eps$ and $MinPts$:
$DBI = \frac{1}{k} \sum_{i=1}^{k} \max_{j \neq i} \left( \frac{avg(C_i) + avg(C_j)}{d_{cen}(\mu_i, \mu_j)} \right)$
where $avg(C)$ represented the average distance between samples in cluster $C$, and $d_{cen}(\mu_i, \mu_j)$ was the distance between the centers of clusters $C_i$ and $C_j$. $avg(C)$ was defined as follows:
$avg(C) = \frac{2}{|C|(|C| - 1)} \sum_{1 \leq i < j \leq |C|} dist(x_i, x_j)$
where $dist(x_i, x_j)$ was the distance between samples $x_i$ and $x_j$.
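A minimal sketch of this fitness, assuming scikit-learn's DBSCAN and DBI implementations and a large penalty for degenerate clusterings (the noise handling is an assumption; the paper does not specify it):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import davies_bouldin_score

def dbi_fitness(params, points):
    """Cluster with candidate (Eps, MinPts) and return the DBI
    (lower is better); degenerate results get a large penalty."""
    eps, min_pts = params[0], int(round(params[1]))
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(points)
    mask = labels != -1                  # drop DBSCAN noise points
    if mask.sum() < 2 or len(np.unique(labels[mask])) < 2:
        return 1e6                       # penalise fewer than 2 clusters
    return davies_bouldin_score(points[mask], labels[mask])

# Joint optimization with the PSO sketch from Section 5.2
# (bounds are illustrative):
# best, _ = pso_minimize(lambda p: dbi_fitness(p, cloud),
#                        lb=np.array([0.05, 4.0]), ub=np.array([2.0, 40.0]))
```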
The crop point cloud data collection environment is shown in Figure 12.
A series of crop point cloud raw data was collected from the UAV-borne LiDAR scanning system; the meta-heuristic-based DBSCAN algorithms were initialized by crop raw data and iterated 200 times to find the optimal Eps and MinPts. The fitness iteration curve is shown in Figure 13.
For a fair comparison among the meta-heuristic algorithms, the BA-based DBSCAN algorithm [17], GA-based DBSCAN algorithm [14], GOA-based DBSCAN algorithm [18], and PSO-based DBSCAN algorithm [15] were initialized with a population size of 20, 2 dimensions, and 200 iterations. In particular, the GA-based DBSCAN algorithm was initialized with a 0.7 crossover rate, 0.5 selection rate, and 0.01 variation rate. Across several test runs, Figure 13 shows that the GOA and PSO found the best fitness of 0.416032; it is suspected that 0.416032 was the global optimum, as further testing with different fine-tuning found nothing lower. Under the same fitness of 0.416032, Eps = 0.8147 and MinPts = 13. In the clustering segmentation, the BA-based, GA-based, GOA-based, and PSO-based DBSCAN algorithms and K-means clustering were employed for comparison; K-means used the default tuning. The time consumption for 200 iterations of the BA-based, GA-based, GOA-based, and PSO-based DBSCAN algorithms was 9669 s, 910 s, 1610 s, and 749 s, respectively. The comparison clustering results are shown in Figure 14a–e. As shown in Figure 14f, the crop point cloud was reconstructed in three dimensions.
As shown in Figure 14a–c, the default-tuned K-means produced three clusters, and the ground cluster could not be separated from the crop cluster using the BA-based or GA-based DBSCAN algorithms. Under the same fitness of 0.416032, Eps = 0.8147 and MinPts = 13 were obtained by both the GOA-based and PSO-based DBSCAN algorithms. Considering that PSO converges faster and takes less time per iteration, the PSO-based DBSCAN algorithm was employed for crop point cloud clustering segmentation.

6.3. Density Enhancement with PU-Net

After the clustering segmentation of the crop point cloud, the crop point cloud was enhanced by a pre-trained PU-net. The PU-net was trained with 60 sets of point cloud variations from "Visionair". The datasets were divided into training and validation sets at 60% and 40%. In the fine-tuning period, the Monte Carlo stochastic sampling method was employed to sample 5000 points from every object as the input. Each input represented the global information of each object; fine-tuning aimed at enhancing the global information collection performance of the pre-trained model. The visualization of the raw crop point cloud data is shown in Figure 15. The sparse part of the point cloud was mainly concentrated along the X-axis, the direction in which the UAV-borne LiDAR scanning system moved forward to generate the 3D point cloud.
The raw crop point cloud was fed into the fine-tuned PU-net. The 3D reconstruction of the crop point cloud with and without density enhancement is shown in Figure 16.
The total number of points in the raw crop point cloud was 10,000, and its density was 1250 points per square meter. After density enhancement, the total number of points was 40,000 and the density was 5000 points per square meter; the number and density of the point cloud were significantly increased. Compared with the raw crop point cloud, the enhanced crop point cloud contained more information and retained the shape and contour characteristics of the raw data. In order to further verify the reliability and effectiveness of the method, 50 sets of density enhancement tests were carried out, and 5 sets of results were sampled randomly for statistical tests. The results are shown in Table 2.

7. Discussion

For integrated navigation testing, the GOA and PSO showed better performance in fitness convergence velocity in the local field, as reported in Ref. [19]. Although this is an unfavorable feature for global optimization, for time-varying and unpredictable problems such as the integrated navigation system in this paper, faster local convergence tends to find a suboptimal solution quickly, which reduces the error sharply. As shown in the GOA-based (Figure 8 and Figure 9) and PSO-based (Figure 10 and Figure 11) integrated navigation error diagrams, both the longitude $L$ and latitude $\lambda$ variance errors were reduced greatly compared with raw integrated navigation, and the GOA-based integrated navigation outperformed the PSO-based one; it is suspected that the optimum-finding ability of GOA was better than that of PSO in this task. For more precise optimal Q and R, the GOA-based integrated navigation was employed for the UAV-borne LiDAR scanning system.
For point cloud density clustering segmentation, default-tuned K-means clustering divided the crop into two clusters and separated the ground points clearly, but parts of the data points were ignored in clustering. Different meta-heuristic-based DBSCAN algorithms showed different optimization capabilities according to Figure 13. The GA was the weakest in point cloud density clustering; even with different variation rates and 800 more iterations in further fine-tuning, the GA-based DBSCAN algorithm still could not reach a fitness value close to 0.416032. The BA-based DBSCAN algorithm showed better optimization capability but lower computing efficiency; its 9669 s of time consumption makes it unsuitable for practical application. Both the GOA-based and PSO-based DBSCAN algorithms found the same fitness of 0.416032; it is suspected that 0.416032 was the global optimum under further testing with different fine-tuning. Considering that PSO converges faster and takes less time per iteration, the PSO-based DBSCAN algorithm was employed for crop point cloud clustering segmentation.
For point cloud density enhancement, the pre-trained PU-net [26] was trained on the "Visionair" dataset, which contains point clouds of many objects for feature learning by a Convolutional Neural Network (CNN). After point cloud density enhancement, the density of the point cloud was greatly increased, and Table 2 shows that there was no significant gap in mean value between the raw crop point cloud and the 5 sets of the enhanced crop point cloud, which indicated that the enhanced point cloud kept the height distribution of the raw data. The faint changes in variance and standard deviation indicated that the 5 sets of the enhanced point cloud also kept the shape and contour of the raw data.

8. Conclusions

This paper developed a UAV-borne LiDAR scanning system with enhanced integrated navigation and dense point cloud optimization. The GOA was employed for optimizing the integrated navigation; the longitude and latitude variances of the optimized integrated navigation were 0.00046 and 0.00034, respectively, against 0.0091 and 0.0047 for the raw integrated navigation. The positioning accuracy of the UAV carrier was improved greatly. The PSO was employed to find the optimal key elements of the DBSCAN clustering segmentation, and accurate point cloud segmentation was obtained. At last, the fine-tuned PU-net was used for point cloud density enhancement. Compared with the raw point cloud, the density of the enhanced crop point cloud was quadrupled, and the statistical parameters showed that the enhanced crop point cloud retained the shape and contour characteristics of the raw data. Each meta-heuristic optimization algorithm used in this paper had its own merits for the different problems, but we still have not found an optimization algorithm that can meet all the processing requirements. In the future, owing to the flexibility of a multi-rotor UAV in remote sensing operations in specific areas, a multi-rotor UAV equipped with a small LiDAR or optical image remote sensing system will be more suitable for remote sensing operations in the agricultural field [27,28].

Author Contributions

Conceptualization, J.C. and Z.Z.; methodology, J.C. and Z.Z.; software, Z.Z. and K.Z.; validation, Z.Z., K.Z., and S.W.; formal analysis, S.W. and Y.H.; investigation, S.W. and Y.H.; data curation, K.Z. and S.W.; writing—original draft preparation, J.C. and Z.Z.; writing—review and editing, K.Z., S.W., and Y.H.; supervision, J.C. and Y.H.; project administration, J.C. and Y.H.; funding acquisition, J.C. and Y.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant No. 51979275), by the National Key R&D Program of China (Grant Nos. 2017YFD0701003 from 2017YFD0701000, 2018YFD0700603 from 2018YFD0700600, and 2016YFD0200702 from 2016YFD0200700), by the Jilin Province Key R&D Plan Project (Grant Nos. 20180201036SF), by the Open Fund of Synergistic Innovation Center of Jiangsu Modern Agricultural Equipment and Technology, Jiangsu University (Grant No. 4091600015), by the Open Research Fund of State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University (Grant No. 19R06), by the Open Research Project of the State Key Laboratory of Industrial Control Technology, Zhejiang University (Grant No. ICT20021), and by the Chinese Universities Scientific Fund (Grant Nos. 2020TC033 and 10710301).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. McManamon, P.F. Review of LiDAR: A historic, yet emerging, sensor technology with rich phenomenology. Opt. Eng. 2012, 51, 060901. [Google Scholar] [CrossRef]
  2. Vázquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D Imaging Systems for Agricultural Applications—A Review. Sensors 2016, 16, 618. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Lovell, J.L.; Jupp, D.L.B.; Newnham, G.J. Measuring tree stem diameters using intensity profiles from ground-based scanning LiDAR from a fixed viewpoint. ISPRS J. Photogramm. Remote Sens. 2011, 66, 46–55. [Google Scholar] [CrossRef]
  4. Strahler, A.H.; Jupp, D.L.B.; Woodcock, C.E. Retrieval of forest structural parameters using a ground-based LiDAR instrument (Echidna). Can. J. Remote Sens. 2008, 34, 426–440. [Google Scholar] [CrossRef] [Green Version]
  5. Zheng, G.; Moskal, L.M.; Kim, S.H. Retrieval of effective leaf area index in heterogeneous forests with terrestrial laser scanning. IEEE Trans. Geosci. Remote Sens. 2013, 51, 777–786. [Google Scholar] [CrossRef]
  6. Lai, X.D.; Liu, Y.S.; Li, Y.X. Application status and development of point cloud density characteristics of airborne LiDAR. Geospat. Inf. 2018, 16, 1–5. [Google Scholar]
  7. Quan, W.; Gong, X.; Fang, J.; Li, J. Prospects of INS/CNS/GNSS Integrated Navigation Technology. In INS/CNS/GNSS Integrated Navigation Technology; Springer: Berlin/Heidelberg, Germany, 2015. [Google Scholar]
  8. Fu, B.; Liu, J.; Wang, Q. Multi-sensor integrated navigation system for ships based on adaptive Kalman filter. In Proceedings of the 2019 IEEE International Conference on Mechatronics and Automation (ICMA), Tianjin, China, 4–9 August 2019. [Google Scholar]
  9. Gao, Y.; Liu, S.; Atia, M.M.; Noureldin, A. INS/GPS/LiDAR integrated navigation system for urban and indoor environments using hybrid scan matching algorithm. Sensors 2015, 15, 23286–23302. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Shabani, M.; Gholami, A.; Davari, N. Asynchronous direct Kalman filtering approach for underwater integrated navigation system. Nonlinear Dyn. 2015, 80, 71–85. [Google Scholar] [CrossRef]
  11. Zhang, G.; Hsu, L.T. Intelligent GNSS/INS integrated navigation system for a commercial UAV flight control system. Aerosp. Sci. Technol. 2018, 80, 368–380. [Google Scholar] [CrossRef]
  12. Yu, H.; Han, H.; Wang, J.; Xiao, H.; Wang, C. Single-frequency GPS/BDS RTK and INS ambiguity resolution and positioning performance enhanced with positional polynomial fitting constraint. Remote Sens. 2020, 12, 2374. [Google Scholar] [CrossRef]
  13. Norouz, M.; Ebrahimi, M.; Arbabmir, M. Modified Unscented Kalman Filter for improving the integrated navigation system. In Proceedings of the 25th Iranian Conference on Electrical Engineering, Tehran, Iran, 2–4 May 2017. [Google Scholar]
  14. Goldberg, D. Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley: Boston, MA, USA, 1989. [Google Scholar]
  15. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the 6th International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar]
  16. Colorni, A.; Dorigo, M.; Maniezzo, V. Distributed optimization by ant colonies. In Proceedings of the 1st European Conference on Artificial Life, Paris, France, 11–13 December 1991; pp. 134–142. [Google Scholar]
  17. Yang, X. A new metaheuristic bat-inspired algorithm. In Proceedings of Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  18. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  19. Feng, H.; Ni, H.; Zhao, R.; Zhu, X. An enhanced grasshopper optimization algorithm to the Bin packing problem. J. Control Sci. Eng. 2020, 2020, 3894987. [Google Scholar] [CrossRef] [Green Version]
  20. Chen, Y.; Zhou, L.; Bouguila, N.; Wang, C.; Chen, Y.; Du, J. BLOCK-DBSCAN: Fast clustering for large scale data. Pattern Recognit. 2020, 109, 107624. [Google Scholar] [CrossRef]
  21. Kedzierski, M.; Fryskowska, A. Methods of laser scanning point clouds integration in precise 3D building modelling. Measurement 2015, 74, 221–232. [Google Scholar] [CrossRef]
  22. Rupnik, B.; Mongus, D.; Žalik, B. Point density evaluation of airborne LiDAR datasets. J. Univers. Comput. Sci. 2015, 21, 587–603. [Google Scholar]
  23. Huang, H.; Wu, S.; Gong, M. Edge-aware point set resampling. ACM Trans. Graph. 2013, 32, 1–12. [Google Scholar] [CrossRef]
  24. Mafarja, M.; Aljarah, I.; Heidari, A.A.; Hammouri, A.I.; Faris, H.; Al-Zoubi, A.M.; Mirjalili, S. Evolutionary population dynamics and grasshopper optimization approaches for feature selection problems. Knowl. Based Syst. 2018, 145, 25–45. [Google Scholar] [CrossRef] [Green Version]
  25. Ledig, C.; Theis, L.; Huszár, F. Photo-realistic single image super-resolution using a generative adversarial network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
  26. Yu, L.; Li, X.; Fu, C.W. PU-net: Point cloud upsampling network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 19–21 June 2018. [Google Scholar]
  27. Zhang, Z.; Han, Y.; Chen, J.; Wang, S.; Wang, G.; Du, N. Information extraction of ecological canal system based on remote sensing data of unmanned aerial vehicle. J. Drain. Irrig. Mach. Eng. 2018, 36, 1006–1011. [Google Scholar]
  28. Wang, S.; Han, Y.; Chen, J.; Pan, Y.; Cao, Y.; Meng, H. Weed classification of remote sensing ecological irrigation area by UAV based on deep learning. J. Drain. Irrig. Mach. Eng. 2018, 11, 1137–1141. [Google Scholar]
Figure 1. Framework of the proposed method. There are three steps: the integrated navigation algorithm based on the grasshopper optimization algorithm (GOA); Unmanned Aerial Vehicle (UAV)-borne Light Detection and Ranging (LiDAR) system construction and point cloud data processing; and point cloud density enhancement based on the Point Cloud Up-Sampling Network.
Figure 2. Framework of global positioning system (GPS)/inertial navigation system (INS) integrated navigation.
Figure 3. Interaction among grasshoppers.
Figure 4. The pipeline of Kalman filter optimization.
Figure 5. Top view of the UAV-borne LiDAR scanning system.
Figure 6. Side view of the UAV-borne LiDAR system.
Figure 7. Flow chart of the particle swarm optimization (PSO) algorithm.
Figure 8. Longitude error of GOA-based integrated navigation.
Figure 9. Latitude error of GOA-based integrated navigation.
Figure 10. Longitude error of PSO-based integrated navigation.
Figure 11. Latitude error of PSO-based integrated navigation.
Figure 12. Crop point cloud raw data collection environment.
Figure 13. Iteration curves of the fitness of the meta-heuristic-based density-based spatial clustering of applications with noise (DBSCAN) algorithms.
Figure 14. Clustering segmentation result of crop and ground. (a) K-means clustering; (b) BA-based DBSCAN (Eps = 0.2145, MinPts = 18); (c) GA-based DBSCAN (Eps = 0.1006, MinPts = 8); (d) PSO-based DBSCAN (Eps = 0.8147, MinPts = 13); (e) GOA-based DBSCAN (Eps = 0.8147, MinPts = 13); (f) 3D reconstruction of crop field point cloud.
Figure 15. Visualization of raw crop point cloud data.
Figure 16. 3D reconstruction of the crop point cloud with and without density enhancement. (a) The raw crop point cloud; (b) the crop point cloud with density enhancement.
Table 1. Error of integrated navigation with different enhancement methods.

| Method | Longitude $L$ Variance Error | Latitude $\lambda$ Variance Error |
|---|---|---|
| GOA-based integrated navigation | 0.00046 | 0.00034 |
| PSO-based integrated navigation | 0.00087 | 0.00079 |
| Raw integrated navigation | 0.0091 | 0.0047 |
Table 2. The statistical parameters of the enhanced point cloud and the raw data.

| | Mean Value | Variance | Standard Deviation |
|---|---|---|---|
| Raw crop point cloud | 558.64 | 171.60 | 13.10 |
| 1st set of the enhanced crop point cloud | 568.84 | 298.48 | 17.27 |
| 2nd set of the enhanced crop point cloud | 578.68 | 289.34 | 17.01 |
| 3rd set of the enhanced crop point cloud | 554.32 | 278.56 | 16.69 |
| 4th set of the enhanced crop point cloud | 565.44 | 325.08 | 18.03 |
| 5th set of the enhanced crop point cloud | 563.56 | 317.51 | 17.82 |
