Article

Integrating a LiDAR Sensor in a UAV Platform to Obtain a Georeferenced Point Cloud

by Alexandre Almeida Del Savio *, Ana Luna Torres, Miguel Angel Chicchón Apaza, Mónica Alejandra Vergara Olivera, Sara Rocío Llimpe Rojas, Gianella Tania Urday Ibarra, José Luis Reyes Ñique and Rolando Issac Macedo Arevalo
Scientific Research Institute, Universidad de Lima, Lima 15023, Peru
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(24), 12838; https://doi.org/10.3390/app122412838
Submission received: 7 November 2022 / Revised: 2 December 2022 / Accepted: 10 December 2022 / Published: 14 December 2022
(This article belongs to the Section Civil Engineering)

Abstract

The combination of light detection and ranging (LiDAR) sensors and unmanned aerial vehicle (UAV) platforms has garnered considerable interest in recent years because of the wide range of applications enabled by the resulting point clouds, such as surveying, building layouts and infrastructure inspection. The attributed benefits include a shorter execution time and higher accuracy when surveying and georeferencing infrastructure and building projects. This study seeks to develop, integrate and use a LiDAR sensor system mounted on a UAV to collect topography data and propose a procedure for obtaining a georeferenced point cloud that can be configured according to the user’s needs. A structure was designed and built to mount the LiDAR system components on the UAV. Survey tests were performed to determine the system’s accuracy. An open-source ROS package was used to acquire data and generate point clouds. The results were compared against a photogrammetric survey, yielding a root mean square error of 17.1 cm in survey measurement reliability and 76.6 cm in georeferencing reliability. Therefore, given the accuracy obtained, the developed system can be used to reconstruct extensive topographic environments and large-scale infrastructure for which a presentation scale of 1/2000 or more is required.

1. Introduction

Unmanned aerial vehicles (UAVs) were initially developed for military missions due to their large-scale mapping capabilities and ability to reach remote locations that people were not authorized to access [1]. In addition, these devices were also commonly used to develop digital elevation models (DEMs) [2] because they were more affordable than conventional methods [3], such as field surveying with a total station [4], and they presented few take-off and landing requirements [5].
The new UAV models possess increased payload capacities, thus supporting technological enhancements such as sensors and cameras. Furthermore, UAVs have expanded their application possibilities, as they can now be used for infrastructure inspections, atmospheric research, fishing ground identification, environmental control, risk and natural disaster management, geological and mining exploration, topographic surveys, volumetric calculations [5], archaeological studies, plant monitoring [6] and crop biomass forecasting [7].
Among the available technologies that can be added to UAV systems are light detection and ranging (LiDAR) sensors. First introduced in the mid-1990s, these sensors produce a spatially accurate point cloud and are commonly used for mapping areas with high vegetation density, building segmentation and forest biomass inventories [8]. When implemented in a UAV, their data capture range is limited to the flight area, but they allow maximum coverage of the targeted area during each flight [9]. Even when the area is difficult to access, data collection times remain competitive with those reported for conventional ground sensors, such as tripod-mounted scanners [10]. Therefore, trajectory reconstruction performance is essential for accurate LiDAR point cloud georeferencing based on UAV flights [11].
A previous study used a UAV to capture high-resolution images and generate surveying data along China’s Altyn Tagh fault line. The resulting DEM had a resolution of 0.065 m, and the orthophoto a resolution of 0.016 m. Based on the data collected from the UAV, supplemented with satellite images, the authors measured recent seismic displacement at 7 ± 1 m. In total, and due to multiple earthquakes, aggregated displacements of 15 ± 2 m, 20 ± 2 m and 30 ± 2 m were measured [12].
LiDAR sensors have a wide variety of applications, such as using an adaptive thresholding algorithm to segment the LiDAR point cloud, followed by a selection process [13]; collecting multi-temporal LiDAR data over 6 years to obtain annual growth rates and gains and losses in carbon storage in human-modified tropical forests [14]; or assisting in the identification of vegetation that threatens the structural integrity of a given area under a Sustainable Conservation Plan [15].
Another study assessed the performance of a UAV equipped with a LiDAR by comparing detected test target points, obtaining a mean square error of ±10 cm without adjustment points and ±4 cm using adjustment control points, compared against a previous photographic 3D reconstruction [16].
In a comprehensive bias impact assessment for UAV-based LiDAR systems, an iterative calibration strategy was proposed to derive system parameters using different geometric characteristics, reducing the errors produced by mounting the LiDAR on the UAV and achieving an accuracy of approximately 2 cm [17].
One of the most important advantages reported by authors is the opportunity to customize the LiDAR system. The integration of a commercial UAV with an Odroid minicomputer and the ROS system was used to perform flight missions with a low-weight platform that processes the acquired data in real time without using a ground station [18]. A study for the 3D survey of forest ecosystems in China developed a low-cost LiDAR sensor system implemented in a UAV, using a multirotor UAV (eight rotors) and a Velodyne Puck VLP-16 as part of the hardware. The structure was designed so that the LiDAR sensor could later be replaced by smaller and lighter models. In addition, two types of software were developed, one to control the LiDAR-UAV system and one to process the LiDAR data. This low-cost system made it possible to obtain the topographic and vegetation parameters necessary for biodiversity studies by acquiring high-density LiDAR data [19].
Among the fields that are adopting the use of LiDAR-UAV systems to improve their operations are agriculture [7,20,21], forestry sciences [19,22], archaeology [23,24], topography [8,11,19], bathymetry [25], building reconstruction [1,26] and structural inspection [10,27].
A UAV system equipped with a LiDAR scanner and a multi-spectral camera system was presented for the study of pre-Columbian Amazonian archaeology. The information collected by the LiDAR sensor has an 80% lateral overlap across six flight lines, three horizontal and three vertical. The result was a study of the forest structure and the reconstruction of the surfaces beneath the forest canopy for the identification of archaeological sites [23].
For identifying, mapping and interpreting ancient Roman mines in northwestern Spain, integrated geometric applications based on information from a UAV system equipped with a LiDAR sensor have been reported. The 1 m resolution LiDAR sensor improved the resolution and survey capabilities for channels, water reservoirs and mines. Furthermore, using a UAV implemented with a LiDAR system for short and detailed surveys reduced data collection time and equipment costs [24].
A fusion between a UAV system with a LiDAR sensor and spectral imaging was shown to identify plant species and provide 3D characterization at sub-meter scales. The resulting classification accuracy was between 84% and 89% for species identification, and the proposed system’s digital elevation model (DEM) correlated with a LiDAR-derived DEM, with R² values of 0.98 and 0.96 [22].
Integrating LiDAR sensors and UAV platforms has shown benefits in agriculture for observing crop production and status. A system was developed using a DJI Matrice 100 UAV, an Odroid XU4, a Velodyne VLP-16 LiDAR sensor, a Point Grey Chameleon3 3.2 MP color camera and a Sony IMX265 sensor as part of the hardware. The ROS system was used to record the data. A variation in crop height of between 0.35 and 0.58 m was found. The components used and the mount design were made available online so they could be adapted to similar projects. The mount, 3D printed in nylon for this project, was designed considering the payload and the pinholes already included in the LiDAR sensor and the inertial measurement unit (IMU) [20].
The use of LiDAR-UAV systems in civil engineering can have many applications. For example, Bolourian et al. (2020) proposed a flight planning method for UAV-based bridge inspection, considering a Matrice 100 UAV and a Velodyne LiDAR Puck for their tests [27]. In [10], a platform for structural inspection was developed, integrating a UAV and a 2D LiDAR scanner. Similarly, [26] used a low-cost IMU and the Hokuyo UTM-30LX sensor, this time mounted on the UAV with an aluminum frame. Finally, Chiang et al. (2017) [1] proposed a LiDAR-UAV system to reconstruct urban environments, combining the UAV with an inertial navigation system (INS), a GNSS receiver and a low-cost Velodyne VLP-16 LiDAR sensor.
The use of LiDAR sensors in various areas has increased in recent years, mainly due to their technological accessibility and diffusion worldwide. Other point cloud generation techniques, such as photogrammetry, are limited because they depend on image-related parameters such as luminosity and meteorological conditions [28]. The LiDAR sensor does not have this problem and can obtain information about places with difficult accessibility and restricted vision [29].
As a result, several companies have developed systems integrated with LiDAR for commercial use. However, these systems can be costly, and their closed design limits users’ ability to customize them: the configurations are already established by the manufacturer and offer limited options for adaptation to users’ needs.
Therefore, this study seeks to develop, integrate and use a LiDAR sensor system implemented in a UAV to provide an open system that can be configured according to the user’s needs. This research also includes the design and construction of an alternative structure for the LiDAR support system, utilizing accessible materials and equipment. An interface between a LiDAR sensor and UAV was created through an integrated system to obtain a georeferenced point cloud, providing an alternative for collecting topographic information through a low-cost system compared to those currently on the market.

2. Materials and Methods

This section discusses the materials used to implement the developed system as part of this study. The system is configured using an Odroid XU4 Mini Computer connected to a Wi-Fi network using a Wi-Fi/USB module. The system is powered using a 2200 mAh/11.1 V LiPo battery connected directly to the VLP-16 sensor and a UBEC 5V regulator connected to the Mini Computer components. Below is a graphic of the integration scheme with the connections between the different parts (Figure 1), and specifications of the main equipment are outlined in Section 2.1.
The LiDAR sensor chosen is the Class 1 Velodyne Puck Lite VLP-16 laser sensor, with a 100 m range, which generates up to 600,000 points/s through a 360° horizontal field of view and a 30° vertical field of view [30]. The GARMIN 18× LVC GNSS receiver is used for data synchronization. The LiDAR was integrated into the Matrice 600 Pro UAV platform [31], a hexacopter approximately 1.5 m in diameter with a maximum take-off weight of 15.5 kg. The drone is powered by six 3-cell LiPo batteries with a capacity of 4.5 Ah each. These batteries allow for approximately 15 min of flight time with the proposed LiDAR system configuration, which carries a payload of 1300 g. Finally, an Odroid XU4 [32] Mini Computer running a Linux operating system is used to collect the information from the VLP-16 through a direct Ethernet connection. Position and orientation information is obtained from the UAV Flight Control System and sent to the Mini Computer through a serial/USB converter module using an ftdi232 chip. More information about the equipment is presented in Table 1.
To validate the obtained point cloud, we surveyed six reference control points using a total station [33]. In addition, a GNSS (Global Navigation Satellite System) [34] was used to survey two additional control points to correct point coordinates for increased accuracy. Finally, the points were placed in a georeferenced orthophoto previously obtained using a DJI Phantom 4 Pro UAV [35] for area recognition and comparison.
In addition, this section also explains the application procedure used to generate the final point cloud with this system. The flow presented in Figure 2 is divided into four main stages: Data Collection, Data Processing, Validation and Final Results. Each step is described below in its corresponding section.

2.1. LiDAR Sensor Support Structure

2.1.1. Structure Design

For mounting the LiDAR system components on the M600 Pro UAV, we designed several parts that could be attached to this UAV model. A maximum payload weight of 6 kg was considered.
The structure was designed based on the LiDAR system measurements: a diameter of 103 mm and a height of 72 mm. To maximize light beam ranges when surveying, the LiDAR sensor was placed vertically on the UAV [36]. In addition, a compartment was required for the other system components that had to be loaded onto the UAV, such as the Odroid XU4 Mini Computer and the power and connection cables. The weight of the LiDAR sensor and its components is approximately 1.5 kg, leaving greater capacity for the other components. Figure 3 below provides detailed drawings of the components modeled in 3D using the SolidWorks software [37].

2.1.2. Structure Construction

Table 2 compares the PLA (Polylactic Acid) and the ABS (Acrylonitrile Butadiene Styrene) materials for building the structure. We selected PLA (Figure 4b) because it features a lower melting point, does not produce intense fumes, and the material is biodegradable, which is an advantage when creating a prototype. The printer used was a Makerbot Replicator Plus (Figure 4a). Figure 4c illustrates the final piece printed using the PLA material.

2.1.3. Structure Assembly

The Matrice 600 Pro has a built-in anchor point that can be used to integrate additional sensors and cameras. Figure 5 below indicates the components used.
The LiDAR sensor, the LiPo battery, the Odroid XU4 Mini Computer, and the power and communication cable were all arranged in the support structure depicted in Figure 5b. The printed part was assembled on the UAV Matrice 600 Pro rail. We added rails and 2″ screws to secure the structure. The LiDAR support structure utilized 2 1/2″ screws.
Then, we connected the UAV Matrice 600 Pro to the XU4 Mini Computer, and the Mini Computer was connected to the Wi-Fi network through a USB Wi-Fi Adapter, as denoted in Figure 5c. Figure 5d illustrates the coupling of the Garmin GPS with its support piece on top of the UAV.

2.2. Data Collection and Processing Procedure Development

The data collection procedure used in the field and the subsequent processing and post-processing steps are specified below.
The data acquisition process is based on a method that facilitates LiDAR recordings in an experimental winter wheat field [38]. Here, LiDAR point clouds are registered, mapped and assessed using the functionalities of the robot operating system (ROS) and the Point Cloud Library (PCL) to estimate crop volumes.
The georeferencing equation [12] was used as a foundation for data processing.

2.2.1. Study Area

Before carrying out the flight, the area to be mapped had to be defined, considering the following variables: number of flight lines, flight altitude, flight time and flight speed. These parameters were entered into the flight planning software that programs the automatic route of the UAV.

2.2.2. Pre-Flight

Using the total station, UAV waypoints were measured once per area. Figure 6 below lists the variables obtained by the total station. The variables used were:
  • Lever Coordinates: Distance from the IMU to the center of the LiDAR in the UAV reference system, expressed along the X, Y and Z axes.
  • Distance from the center of IMU to the floor.
  • Distance from the center of the LiDAR to the floor.
The GPS time was synchronized with the LiDAR time, so that time is the common reference between both instruments.
A point cloud was generated from photos of a Phantom 4 Pro UAV for georeferencing. The control points were generated by the total station from the positions provided by the Emlid GPS RTK module.

2.2.3. During the Flight

The data collection stage was conducted using an Odroid XU4 Mini Computer with the Ubuntu 14.04 LTS distribution of the Linux operating system [32]. Hardware communications are established via ROS Indigo, since this framework runs on a Linux operating system [38].
ROS is a framework of processes (also known as nodes) developed by Willow Garage and commonly used as a meta-operating system for robots, providing services such as hardware abstraction, low-level device control, message transmission between processes, package management, etc. Its architecture is mainly based on the publish-subscribe network pattern, in which multiple nodes register with a master node that manages communication between them. Users can establish a distributed network of nodes that communicate through messages that travel on buses known as topics [26].
Here, an open-source ROS package was used to acquire data and generate point clouds from the UDP Ethernet packets provided by the VLP-16 [36]. This ROS package contains the velodyne_driver package, which offers essential sensor management via the velodyne_node ROS node, capturing LiDAR data and publishing them in raw form as velodyne_msgs/VelodyneScan messages on the velodyne_packets topic. Raw data are then saved in pcap format using the vdump command.
Figure 7 below illustrates the communications scheme for acquiring LiDAR system data and position and orientation data using the dji_sdk ROS package, which provides a ROS interface to the DJI onboard SDK. This allows us to control the M600 Pro platform through ROS messages and services. This package is based on the main DJI SDK library in C++ [33]. During ROS node execution, dji_sdk publishes orientation data as geometry_msgs/QuaternionStamped messages on the /dji_sdk/attitude topic and position data as sensor_msgs/NavSatFix messages on the /dji_sdk/gps_position topic. These data are recorded using the rosbag command and are later converted to csv format.
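As an illustration of this last step, the following Python sketch extracts the recorded position and attitude messages from a bag file and writes them to csv. The bag and output file names are hypothetical, and the snippet assumes a standard ROS 1 installation with the rosbag Python API available; the actual conversion script used in this work is not reproduced here.

```python
import csv
import rosbag  # ROS 1 Python API

BAG_FILE = "flight_record.bag"  # hypothetical file name

# Export GPS fixes (sensor_msgs/NavSatFix) published by dji_sdk.
with rosbag.Bag(BAG_FILE) as bag, open("gps_position.csv", "w") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "latitude", "longitude", "altitude"])
    for _, msg, t in bag.read_messages(topics=["/dji_sdk/gps_position"]):
        writer.writerow([t.to_sec(), msg.latitude, msg.longitude, msg.altitude])

# Export attitude quaternions (geometry_msgs/QuaternionStamped).
with rosbag.Bag(BAG_FILE) as bag, open("attitude.csv", "w") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "qx", "qy", "qz", "qw"])
    for _, msg, t in bag.read_messages(topics=["/dji_sdk/attitude"]):
        q = msg.quaternion
        writer.writerow([t.to_sec(), q.x, q.y, q.z, q.w])
```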
After powering on and starting all system components, the operator must connect to the Mini Computer via the SSH protocol using the PuTTY software and activate the VNC graphical display connection to start collecting data. To guarantee synchronization, the system clock must be updated through NTP services. In addition, the VLP-16 webserver page must be accessed to check whether the GPS PPS option is Locked. Next, the ROS commands are executed to start storing data and the scheduled flight is performed. Finally, an FTP connection is used to retrieve the pcap and csv files and start the georeferenced point cloud generation stage.

2.2.4. Generation of Georeferenced Point Cloud

The generation of the georeferenced point cloud is the transformation of the local coordinates from the LiDAR sensor (s-frame) to global coordinates (g-frame) [33]. The local coordinates of a point acquired at a given time t are georeferenced according to the model proposed in Equation (1) below [21].
p_t^g = P_t^g + R_b^g(t) · (R_s^b · p_t^s + l^b)        (1)
Equation (1) is the georeferencing equation. It describes an indirect measurement: the center of the UAV cannot be accessed directly, so measurements taken relative to it would not otherwise be correct, since they are not referred to its center (0, 0, 0). The errors produced when applying Equation (1) are corrected by comparing two point clouds: a real cloud, which uses the points provided by the total station, and an artificial cloud constructed using the measurements from the UAV.
p_t^g = P_t^g + R_b^g(t) · (R_cal · R_s^b · p_t^s + a_cal + l^b)        (2)
Equation (2) denotes the proposed georeferencing equation; it is based on Equation (1) but adds two variables that correct measurement errors: R_cal and a_cal.
where:
1. P_t^g: Three-dimensional position vector in the g-frame. This position information is delivered by the UAV’s GNSS/INS system in latitude, longitude and altitude on the WGS 84 reference ellipsoid, published at 50 Hz.
2. R_b^g(t): Rotation matrix that transforms a vector from the UAV’s frame of reference, the b-frame, to the g-frame. This information is obtained from the IMU of the UAV, which delivers the vehicle attitude represented as a quaternion for the rotation from the FLU (Forward, Left, Up) body frame to the global ENU (East, North, Up) frame, published at 100 Hz.
3. R_s^b: Rotation matrix from the s-frame (LiDAR sensor) to the b-frame (UAV), which depends on the mounting base.
4. p_t^s: Points provided by the LiDAR sensor, expressed in the s-frame.
5. R_cal and a_cal: Calibration matrix and vector that correct errors caused by assembly and shaft misalignment issues.
6. l^b: Lever arm displacement between the LiDAR sensor and the body frame origins, defined within the b-frame.
These variables are divided between static and dynamic (Table 3). Static variables do not change during application of the Equation, while dynamic variables change according to the point used within the point cloud.
Figure 8 includes a diagram that denotes the location of the reference frames used. For example, in this Figure, the b-frame (related to GNSS/INS), the s-frame (related to LiDAR), and the g-frame (ENU) may be observed.
The data obtained from the LiDAR sensor are expressed in the s reference system. However, the rotation matrix is applied since these data must be expressed in the b reference system [31].
To obtain the R_s^b rotation matrix, basic sequential rotations around the ZXZ axes are used, as per Equation (3).
R_s^b = R_z(π/2) · R_x(π/2) · R_z(π/3 + 0.135)        (3)
The divisors of π (which represents 180°) and the added 0.135 rad value follow from the theory of rotations and the geometry of the 3D-printed adapter; mathematically, three successive rotations are performed around the Z, X and Z axes. For the z_angle, the 0.135 value was adjusted manually by overlapping the cloud with a reference plane RP located below the UAV.
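A minimal NumPy sketch of this construction is shown below, assuming the elementary rotation matrices follow the usual right-handed convention and using the angle values of the reconstructed Equation (3); the exact values depend on the adapter geometry and the manual z_angle adjustment.

```python
import numpy as np

def rot_z(a):
    """Elementary rotation of angle a (rad) around the Z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    """Elementary rotation of angle a (rad) around the X axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

# Equation (3): Z-X-Z sequence; 0.135 rad is the manually adjusted z_angle offset.
R_s_b = rot_z(np.pi / 2) @ rot_x(np.pi / 2) @ rot_z(np.pi / 3 + 0.135)
```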
Figure 9 denotes the raw point cloud PC without applying the rotation matrix. This point cloud contains RP and is in the LiDAR reference system.
Figure 10 denotes the raw point cloud RP expressed in the reference system of the UAV. This cloud results from applying the R_s^b rotation matrix to each element of the point cloud in the LiDAR reference system and adding the l^b displacement.
With the data in the same reference system, i.e., when the rotation matrix has been resolved, the point cloud calibration process can begin.
Figure 11 shows the initial parameters, alignment parameters, and calibration equation used for point cloud calibration. The parameters are detailed below.
Initial parameters:
  • R_bs2bi: Rotation matrix. Converts LiDAR data from the LiDAR-centric reference system to the UAV-centric (IMU) reference system.
  • a_bi: Lever coordinates, measured with the total station.
Alignment parameters:
  • R_al and a_al: Data provided by the CloudCompare software after overlapping the point cloud from the UAV with an artificial point cloud derived from the total station. The artificial point cloud is created with Python based on the measurement from the IMU center of the UAV to the ground (a sketch of this step is given below); the area generated is proportional to the ground area covered by the LiDAR. This process is executed only once.
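A minimal sketch of how such an artificial ground-plane cloud could be generated is shown below. The grid extent, point spacing and IMU-to-ground height are hypothetical placeholders; the original script is not reproduced here.

```python
import numpy as np

def artificial_ground_plane(half_width, half_length, spacing, imu_to_ground):
    """Generate a planar point cloud at z = -imu_to_ground in the UAV (IMU) frame.

    The plane plays the role of the total-station reference used by CloudCompare
    to estimate the alignment parameters R_al and a_al.
    """
    x = np.arange(-half_width, half_width + spacing, spacing)
    y = np.arange(-half_length, half_length + spacing, spacing)
    xx, yy = np.meshgrid(x, y)
    zz = np.full_like(xx, -imu_to_ground)
    return np.column_stack([xx.ravel(), yy.ravel(), zz.ravel()])

# Example: 20 m x 20 m grid, 0.10 m spacing, IMU measured 0.45 m above the ground.
plane = artificial_ground_plane(10.0, 10.0, 0.10, 0.45)
np.savetxt("artificial_plane.xyz", plane, fmt="%.3f")  # ASCII cloud importable into CloudCompare
```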
Table 4 lists the equivalences between the calibration equation and the parameters from the georeferencing equation for the point cloud calibration process.
Based on static values, the point cloud covers the entire route, including UAV take-off and landing.
The pcap files for each segment are stored in their corresponding folders. There can be multiple pcap files per segment.
Figure 12 denotes the calibration results from the point cloud PC calibrated using static variables.
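The static part of Equation (2) can be applied to the raw cloud with a few lines of NumPy, as sketched below. The R_bs2bi matrix, lever arm a_bi and alignment parameters R_al and a_al are placeholders here; in practice they come from Equation (3), the total station measurement and the CloudCompare alignment, respectively.

```python
import numpy as np

# Placeholder static parameters (see Table 4 for the equivalences).
R_bs2bi = np.eye(3)                  # s-frame -> b-frame rotation, from Equation (3)
a_bi = np.array([0.0, 0.0, -0.45])   # lever arm l^b measured with the total station
R_al = np.eye(3)                     # calibration rotation R_cal from CloudCompare
a_al = np.zeros(3)                   # calibration translation a_cal from CloudCompare

def apply_static_calibration(r_s):
    """Apply the static terms of Equation (2) to an (N, 3) array of LiDAR points."""
    return (R_al @ R_bs2bi @ r_s.T).T + a_al + a_bi

# Example with a dummy cloud of three points in the LiDAR frame.
r_s = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
r_b = apply_static_calibration(r_s)
```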
The program developed for the point cloud georeferencing process uses the following Python packages: pandas, NumPy, PyMap3D and laspy. The loaded parameters are provided in Figure 11, which includes the initial and alignment parameters.
The folder path is where the corresponding pcap files for each segment are located.
The dynamic values of the georeferencing equation are r_bi2enu and x_gps_enu. R_bi2enu provides the IMU orientation, and x_gps_enu provides the GPS positions. Because the IMU and the GPS publish at different rates than the LiDAR, both values must be interpolated in time to match each LiDAR point.
Algorithm 1 denotes the development of the georeferenced point cloud for each path.
Algorithm 1: Generation of Georeferenced Point Cloud for Each Path
Input: r_s: point cloud matrix within the s-frame,
t_s: vector indicating the time associated with each r_s point.
euler_bi: Euler angle value matrix (b-frame → i-frame),
t_euler: vector indicating the time associated with each euler_bi point.
p_enu: UAV position matrix within the i-frame (ENU),
t_enu: vector indicating the time associated with each p_enu point.
Output: r_utm: matrix containing the values of the georeferenced point cloud.
Obtain the time intervals corresponding to the desired path.
For each of the r_s points:
Multiply r_s(i) by the calibration parameters and obtain r_b(i).
By interpolation in euler_bi, obtain the orientation value corresponding to each r_s (i) and store it in euler_i.
Obtain R_i, rotation matrix corresponding to euler_i.
Calculate: r_i(i) = R_i * r_b(i).
By interpolation in p_enu, obtain p_enu_i, the position value corresponding to each r_s (i).
Calculate: r_enu(i) = r_i(i) + p_enu_i
End of For
Convert r_enu to the UTM reference system to obtain r_utm.
Generate the file from r_utm.
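A condensed Python sketch of Algorithm 1 is given below. It assumes the calibrated points r_b from the static step, Euler angles and ENU positions already loaded as NumPy arrays with their timestamps, and a "zyx" Euler convention via SciPy's Rotation helper; the actual convention, array layouts and file handling of the original script may differ.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def georeference_path(r_b, t_s, euler_bi, t_euler, p_enu, t_enu):
    """Georeference calibrated LiDAR points (N, 3) for one path, returning ENU coordinates."""
    # Interpolate the UAV attitude (Euler angles) at each LiDAR point time.
    euler_i = np.column_stack(
        [np.interp(t_s, t_euler, euler_bi[:, k]) for k in range(3)]
    )
    # Interpolate the UAV ENU position at each LiDAR point time.
    p_enu_i = np.column_stack(
        [np.interp(t_s, t_enu, p_enu[:, k]) for k in range(3)]
    )
    # Rotate each point into the ENU frame and translate by the UAV position.
    R_i = Rotation.from_euler("zyx", euler_i).as_matrix()   # (N, 3, 3), assumed convention
    r_i = np.einsum("nij,nj->ni", R_i, r_b)
    return r_i + p_enu_i

# The returned r_enu array can then be converted to UTM and written to a file
# (see the conversion sketch in the following subsection).
```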
Table 5 lists the equivalences between the calibration equation and the parameters from the georeferencing equation for developing Algorithm 1.
Figure 13 denotes the final point cloud results after adding the dynamic values from the georeferencing equation. Finally, the point cloud is georeferenced in the UAV reference system.
  • R_bs_0: Time vector. Its values are adapted to the parameters from the point cloud within the X, Y and Z parameters.
  • X_enu: Vector generated by concatenation. Here, concatenation is initialized, and a copy is generated.
Figure 14 below illustrates the conversion of the vectors generated by concatenation to the UTM reference system.
X_enu is converted to the geodetic system (latitude, longitude) from the ENU system. The height is based on the ellipsoid when converting X_enu from the geodetic system to UTM.
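This conversion can be sketched with PyMap3D and the utm package as follows, where lat0, lon0 and h0 are the geodetic coordinates of the local ENU origin; the origin values shown are hypothetical, and the use of the utm package (rather than the original script's own conversion) is an assumption.

```python
import numpy as np
import pymap3d
import utm

# Hypothetical local ENU origin (geodetic coordinates of the first GPS fix).
lat0, lon0, h0 = -12.085, -76.972, 255.0

def enu_to_utm(r_enu):
    """Convert an (N, 3) array of ENU coordinates to UTM easting/northing and ellipsoidal height."""
    e, n, u = r_enu[:, 0], r_enu[:, 1], r_enu[:, 2]
    lat, lon, h = pymap3d.enu2geodetic(e, n, u, lat0, lon0, h0)
    easting, northing, zone, letter = utm.from_latlon(lat, lon)
    return np.column_stack([easting, northing, h]), zone, letter
```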

2.2.5. Post-Processing

Once the point cloud is obtained, the noise filter tool of the CloudCompare software is used. CloudCompare is an open-source project with many advanced techniques for point cloud registration and resampling. Python scripts are employed for georeferencing the point cloud and creating files in *.las format.
The process consists of using the Clean tool and the Noise filter option. In the tool window, the Radius option is selected with a value of 0.03 for Neighbors, and for Max error, the relative option is chosen with a value of 1.00. The Remove isolated points option is also selected [34]. This tool cleans out the outliers using an algorithm that fits planes locally (around each point in the cloud) and then eliminates a point if it is too far from the fitted plane. To register and align the clouds generated from the different trajectories, the alignment tool based on selecting pairs of equivalent points was used, through which two entities may be aligned by selecting at least three matching point pairs in both entities. Finally, the fusion tool generated a single point cloud from all the strips.
The filtering process produced a more defined point cloud of the area, eliminating around 25% of points not calibrated with the point cloud, as shown in Figure 15. The georeferencing proposal is constrained by the quality of the measurements of the GNSS/INS sensors. The only semi-automatic part of the georeferencing process is locating the start and end times of each parallel path. Extrapolating the method to other study cases relies on using correctly georeferenced control points.
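For the *.las export step mentioned above, a minimal sketch using the laspy 2.x API could look as follows; the point array, scale and offset values are illustrative.

```python
import laspy
import numpy as np

def write_las(points_utm, path="georeferenced_cloud.las"):
    """Write an (N, 3) array of UTM coordinates (easting, northing, height) to a LAS file."""
    header = laspy.LasHeader(point_format=3, version="1.2")
    header.offsets = points_utm.min(axis=0)
    header.scales = np.array([0.001, 0.001, 0.001])  # millimetre resolution
    las = laspy.LasData(header)
    las.x = points_utm[:, 0]
    las.y = points_utm[:, 1]
    las.z = points_utm[:, 2]
    las.write(path)

# Example with a dummy two-point cloud (easting, northing, ellipsoidal height).
write_las(np.array([[285605.4, 8663213.3, 255.2],
                    [285612.8, 8663206.7, 256.8]]))
```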

3. Results

A flight plan was prepared with five flight lines over the gable roof of a hangar, which was used as a reference to compare the point cloud from the photogrammetric survey against the survey generated by the total station. This area was used because the roof of this building has a simple sloping geometry, which allows it to be used as a reference plane for the fusion of point clouds generated during the flight.
The flight plan was configured using the DJI Ground Station Pro software [35], with the flight parameters summarized in Table 6.
Figure 16 indicates the flight plan, wherein the UAV travels across five flight lines. In these flight lines, information is collected via the LiDAR sensor. Hence, a different point cloud of the area is obtained for each traveled line, thus generating a total of five point clouds.
From the UAV positions provided by the GPS in geodetic coordinates, we obtained the path followed in the UTM system.
Figure 17 shows the result of the georeferencing equation in the UTM reference system after the conversion illustrated in Figure 13. The complete path followed by the UAV is expressed in UTM coordinates for X and Y (easting and northing) and WGS84 ellipsoidal height for Z.
The final point cloud is the result of merging the five different point clouds, through which better point accuracy and density were obtained, as shown in Figure 18.

Procedure Validation

In Figure 19, the control points used for validating the procedure are indicated in orthophoto (Figure 19a) and plan (Figure 19b) formats.
Table 6 lists the coordinates obtained using the different survey methods (total station, LiDAR and photogrammetry). In turn, Table 7 lists the coordinate comparison errors between the reference control points obtained with the total station and the control points obtained with the LiDAR and photogrammetry procedures. These errors mainly reflect the positional error between the GPS and the differential GPS of the UAV.
As a first result, a root mean square error (RMSE) of 35.514 m was obtained from georeferencing (Table 7). Subsequently, as part of the process, an offset correcting the georeferencing errors of the survey is added to the workflow, in which the LiDAR point cloud is superimposed on the coordinates previously defined with the total station. For this application case, the translation values were 1.757 m along the X-axis and 1.867 m along the Y-axis at point T4. In addition, two reference points are located for the survey overlay. In this example, the rotation was 55°, as shown in Figure 20.
In Figure 21, the same processing was applied for the translation along the Z axis. For this case, the value was 35 m.
The results from this procedure are described in Table 8 and Table 9. The corrected control points are listed per point together with the related equipment (Table 8). Table 9 denotes the RMSE for the LiDAR survey obtained through georeferencing, whose value is 0.766 m with respect to the total station.
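As a check on how this figure is obtained, the sketch below computes the RMSE from per-point coordinate differences, assuming it is defined over the 3D error components of all control points; with the ΔX, ΔY and ΔZ values of the corrected LiDAR control points this reproduces the 0.766 m reported.

```python
import numpy as np

def rmse_3d(deltas):
    """RMSE over the 3D error components of the control points; deltas is (N, 3)."""
    return float(np.sqrt(np.mean(np.sum(deltas ** 2, axis=1))))

# ΔX, ΔY, ΔZ (m) of the corrected LiDAR control points T1-T6.
deltas = np.array([
    [-0.2921, -0.7855, 0.7220],
    [-0.1950, -0.9230, 0.5020],
    [ 0.3190, -0.5770, 0.5220],
    [ 0.0330,  0.0000, 0.1160],
    [-0.2850, -0.1030, 0.1570],
    [-0.1720, -0.3970, 0.3620],
])
print(round(rmse_3d(deltas), 3))  # -> 0.766
```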

4. Discussion

An RMSE of 35.513 m was initially obtained when georeferencing the LiDAR point cloud (Table 7). After offsetting the errors, an RMSE of 0.766 m was achieved (Table 9). According to Table 9, the maximum errors obtained were 0.319 and 0.923 m for the X and Y axes, respectively, and the corresponding minimum errors were 0.033 and 0.000 m. For the Z-axis, the error ranged between 0.116 and 0.722 m, which is within the error range for the Matrice 600: 3 m in X and Y, and 5 m in Z. Other works studying the vertical accuracy of LiDAR also found that the Z-axis presents the largest errors in trials performed with such systems, sometimes exceeding the manufacturer’s specification [36,37].
Table 10 shows the corrected control point cloud errors, and Table 11 compares the distances measured between the control points with LiDAR and UAV photogrammetry against the reference values from the total station (TS) to determine the accuracy of the measurements of the surveyed objects. Here, the maximum absolute error between the LiDAR and the TS was 0.347 m, between points T3 and T4, and the minimum absolute error was 0.100 m, between T1 and T2. On the other hand, for UAV photogrammetry, the maximum absolute error was 0.190 m, between points T6 and T1, and the minimum absolute error was 0.006 m, between T5 and T6. Likewise, the RMSE of the LiDAR and UAV photogrammetry was calculated against the values obtained with the TS. LiDAR presented an RMSE of 0.171 m, while photogrammetry presented an RMSE of 0.024 m.
Similarly, the LiDAR-UAV system proposed for the reconstruction of urban environments reached an accuracy of approximately 1 m compared to the results of a terrestrial laser scanner (TLS) [1]. These results are in the range of the accuracy errors of the presented investigation. Additionally, the system developed to observe crop production [20], using software and materials similar to those applied in this investigation, found a variation in crop height of between 0.35 and 0.58 m. The use of a UAV-LiDAR system for height estimation of different crops showed an RMSE of 0.034, 0.074 and 0.12 m for wheat, sugar beet and potato, respectively [7].
The information from both point clouds is similar, and both show pros and cons. For example, although photogrammetry captures the area with greater texture detail, it is affected by light and shadows. LiDAR, on the other hand, is not affected by weather conditions. LiDAR is also more effective than photogrammetry in areas with dense vegetation or where the surface is difficult to assess visually [37], and it is better suited for large areas; for more detailed surveys, it would be preferable to add more conventional topographic techniques as a complement [36], such as the GCPs obtained with the GNSS and total station used in this investigation. A comparison between these two point clouds is shown in Figure 22.
It is important to note that this investigation’s most significant advantage was the ability to customize the system according to our specifications and the available equipment. Furthermore, as other projects have shown, developing our own system allows improving and changing the supporting structure as new technology emerges [10,19], defining the location of the sensor according to the user’s needs [10,19] and using low-cost equipment [1].

5. Conclusions

This paper proposes the development, integration and use of a system to generate a georeferenced point cloud based on integrating a LiDAR sensor into a UAV platform. Furthermore, we designed and built a support structure to facilitate coupling the sensor to the drone platform. Finally, ROS packages and the Python programming language were used for data collection and processing.
The proposed LiDAR system captured the shape of the object studied and yielded a maximum error of 0.347 m (34.7 cm) and an RMSE of 0.171 m (17.1 cm) (Table 10). In terms of accuracy, the results revealed that the georeferenced point cloud generated by the implemented system exhibited an RMSE of 0.766 m (76.6 cm). Therefore, given the accuracy obtained in the work presented and in line with the recommended scales for topographic work, the developed system can be used to reconstruct extensive topographic environments and large-scale infrastructure for which a presentation scale of 1/2000 or more is required.
Future research studies may consider using high-precision GNSS/INS sensors, with errors of less than 0.1 degrees in orientation measurements and 5 cm in position measurements, to improve the accuracy of the point cloud. Path generation can also be enhanced by processing sensor position and orientation data through a particle filter to better model sensor uncertainty. This would be supported by developing a particle filter framework for solving positioning, navigation and tracking problems (sequential Monte Carlo methods), which has been shown numerically to outperform classical algorithms based on Kalman filters in airborne applications because it supports non-linear models and non-Gaussian noise. In addition, registration algorithms, such as Iterative Closest Point (ICP) variants, can be incorporated into the workflow to improve the adjustment and optimize the system recalibration process, thereby offsetting systematic measurement errors and leveraging the redundant information contained in the strip overlap areas that comprise the point cloud and the terrain data obtained with support equipment [34]. ICP addresses the registration problem in point clouds, dealing with the local minimum problem through directly georeferenced points and feedback of the bias into the GNSS/INS systems [38].

Author Contributions

Conceptualization, A.A.D.S.; methodology, A.A.D.S., M.A.C.A., A.L.T.; software, M.A.C.A.; validation, M.A.V.O., A.L.T., J.L.R.Ñ.; formal analysis, M.A.C.A., M.A.V.O.; investigation, S.R.L.R., G.T.U.I., R.I.M.A.; resources, A.A.D.S., A.L.T., J.L.R.Ñ.; data curation, M.A.V.O.; writing—original draft preparation, M.A.V.O., M.A.C.A.; writing—review and editing, A.A.D.S., G.T.U.I., A.L.T.; visualization, M.A.V.O., S.R.L.R., G.T.U.I.; supervision, A.A.D.S., A.L.T.; project administration, A.A.D.S.; funding acquisition, A.A.D.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Scientific Research Institute (IDIC) of the Universidad de Lima under research project funding number PI.56.007.2018.

Data Availability Statement

The data supporting the findings of this study are available within the article.

Acknowledgments

The authors thank the Universidad de Lima for supporting this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chiang, K.; Tsai, G.; Li, Y.; El-Sheimy, N. Development of LiDAR-based UAV system for environment reconstruction. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1790–1794. [Google Scholar] [CrossRef]
  2. Uysal, M.; Toprak, A.S.; Polat, N. DEM generation with UAV Photogrammetry and accuracy analysis in Sahitler hill. Measurement 2015, 73, 539–543. [Google Scholar]
  3. Salach, A.; Bakuta, K.; Pilarska, M.; Ostrowski, W.; Gorski, K.; Kurczynski, Z. Accuracy Assessment of Point Clouds from LiDAR and Dense Image Matching Acquired Using the UAV Platform for DTM Creation. Int. J. Geo-Inf. 2018, 7, 342. [Google Scholar]
  4. Berreta, F.; Shibata, H.; Cordova, R.; de Lemos Peroni, R.; Azambuja, J.; Coimbra Leite Costa, J. Topographic modelling using UAVs compared with traditional survey methods in mining. Int. Eng. J. 2018, 71, 463–470. [Google Scholar]
  5. Zeybek, M.; Şanlıoğlu, İ. Point cloud filtering on UAV based point cloud. Measurement 2018, 133, 99–111. [Google Scholar]
  6. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of a UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 2012, 4, 1519–1543. [Google Scholar] [CrossRef] [Green Version]
  7. Ten Harkel, J.; Bartholomeus, H.; Kooistra, L. Biomass and Crop Height Estimation of Different Crops Using UAV-Based Lidar. Remote Sens. 2019, 12, 17. [Google Scholar]
  8. Lin, Y.; Hyyppa, J.; Jaakkola, A. Mini-UAV-Borne LIDAR for Fine-Scale Mapping. IEEE Geosci. Remote Sens. Lett. 2011, 8, 426–430. [Google Scholar]
  9. Lin, Y.C.; Cheng, Y.T.; Zhou, T.; Ravi, R.; Hasheminasab, S.M.; Flatt, J.E.; Troy, C.; Habib, A. Evaluation of UAV LiDAR for mapping coastal environments. Remote Sens. 2019, 11, 2893. [Google Scholar]
  10. Nasrollahi, M.; Bolourian, N.; Hammad, A. Designing LiDAR-equipped UAV Platform for Structural Inspection. Int. Symp. Autom. Robot. Constr. 2018, 35, 1–8. [Google Scholar]
  11. Jozkow, G.; Totha, C.; Grejner-Brzezinska, D. UAS topographic mapping with Velodyne LiDAR sensor. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 201–208. [Google Scholar] [CrossRef] [Green Version]
  12. Gao, M.; Xu, X.; Klinger, Y.; van der Woerd, J.; Tapponnier, P. High-resolution mapping based on an Unmanned Aerial Vehicle (UAV) to capture paleoseismic offsets along the Altyn-Tagh fault, China. Sci. Rep. 2017, 7, 8281. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Toth, C.K.; Barsi, A.; Lovas, T. Vehicle recognition from LiDAR data. Int. Arch. Photogramm. Remote Sens. 2003, 34, W13. [Google Scholar]
  14. Moura, Y.M.; Balzter, H.; Galvão, L.S.; Dalagnol, R.; Espírito-Santo, F.; Santos, E.G.; García, M.; Bispo, P.C.; Oliveira, R.C.; Shimabukuro, Y.E. Carbon Dynamics in a Human-Modified Tropical Forest: A Case Study Using Multi-Temporal LiDAR Data. Remote Sens. 2020, 12, 430. [Google Scholar] [CrossRef] [Green Version]
  15. Comer, D.C.; Comer, J.A.; Dumitru, I.A.; Ayres, W.S.; Levin, M.J.; Seikel, K.A.; White, A.D.; Harrower, M.J. Airborne LiDAR Reveals a Vast Archaeological Landscape at the Nan Madol World Heritage Site. Remote Sens. 2019, 11, 2152. [Google Scholar] [CrossRef] [Green Version]
  16. Fernández-Lozano, J.; Gutiérrez-Alonso, G. Improving archaeological prospection using localized UAVs assisted photogrammetry: An example from the Roman Gold District of the Eria River Valley (NW Spain). J. Archaeol. Sci. Rep. 2016, 5, 509–520. [Google Scholar] [CrossRef]
  17. Troy, C.D.; Cheng, Y.T.; Lin, Y.C.; Habib, A. Rapid Lake Michigan shoreline changes revealed by UAV LiDAR surveys. Coast. Eng. 2021, 170, 104008. [Google Scholar] [CrossRef]
  18. Roca, D.; Armesto, J.; Lagüela, S.; Díaz-Vilariño, L. Lidar-equipped UAV for building information modelling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 45, 523–527. [Google Scholar] [CrossRef] [Green Version]
  19. Bolourian, N.; Hammad, A. LiDAR-equipped UAV path planning considering potential locations of defects for bridge inspection. Autom. Constr. 2020, 117, 103250. [Google Scholar] [CrossRef]
  20. Velodyne LiDAR Puck VLP-16. Available online: https://pdf.aeroexpo.online/es/pdf-en/velodyne/puck-vlp-16/176220-7834.html#open (accessed on 20 October 2019).
  21. Matrice 600. Available online: https://www.dji.com/matrice600 (accessed on 7 November 2022).
  22. ODROID-XU4 Single Board Computer. Available online: https://sha256systems.eu/ODROID-XU4 (accessed on 20 October 2019).
  23. Estación total TOPCON ES 105. Available online: https://geotop.com.pe/producto/estacion-total/estacion-total-topcon/topcon-es105/ (accessed on 20 October 2019).
  24. Hiper VR GNSS Receiver. Available online: https://www.measurementsystems.org/Hiper%20VR%20GNSS%20Receiver (accessed on 20 October 2019).
  25. DJI Phantom 4 Pro. Available online: https://www.dji.com/phantom-4-pro?site=brandsite&from=nav (accessed on 20 October 2019).
  26. Christiansen, M.; Laursen, M.; Jørgensen, R.; Skovsen, S.; Gislum, R. Designing and testing a UAV mapping system for agricultural field surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [Green Version]
  27. Solidworks CAD 3D. Available online: https://www.solidworks.com/es/category/3d-cad (accessed on 6 March 2020).
  28. Tan, Y.; Li, G.; Cai, R.; Ma, J.; Wang, M. Mapping and modelling defect data from UAV captured images to BIM for building external wall inspection. Autom. Constr. 2022, 139, 104284. [Google Scholar] [CrossRef]
  29. Song, C.; Chen, Z.; Wang, K.; Luo, H.; Cheng, J.C. BIM-supported scan and flight planning for fully autonomous LiDAR-carrying UAVs. Autom. Constr. 2022, 142, 104533. [Google Scholar] [CrossRef]
  30. Velodyne_Pointcloud Package Summary. Available online: http://wiki.ros.org/velodyne_pointcloud?distro=indigo (accessed on 13 November 2019).
  31. Kannan, K. Development of a Reference Software Platform for the Velodyne VLP-16 LiDARS. Master’s Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, October 2016. [Google Scholar]
  32. DJI SDK Package Summary. Available online: http://wiki.ros.org/dji_sdk (accessed on 13 November 2019).
  33. Bäumker, M.; Heimes, F. New Calibration and Computing Method for Direct Georeferencing of Image and Scanner Data Using the Position and Angular Data of a Hybrid Inertial Navigation System. In OEEPE Workshop; Integrated Sensor Orientation: Hannover, Germany, 2001; Volume 43, pp. 197–212. [Google Scholar]
  34. CloudCompare Version 2.6.1 User Manual. Available online: http://www.cloudcompare.org/doc/qCC/CloudCompare%20v2.6.1%20-%20User%20manual.pdf (accessed on 22 January 2020).
  35. DJI GS Pro. Available online: https://www.dji.com/ground-station-pro (accessed on 8 March 2020).
  36. Salinas Castillo, W.; Paredes Hernández, C.; Martínez Becerra, X.; Guevara Cortina, F. Evaluación de la exactitud posicional vertical de una nube de puntos topográficos Lidar usando topografía convencional como referencia. Investig. Geográficas 2014, 85, 5–17. [Google Scholar] [CrossRef] [Green Version]
  37. Gil, A.L.; Núñez-Casillas, L.; Isenburg, M.; Benito, A.A.; Bello, J.J.R.; Arbelo, M. A comparison between LiDAR and photogrammetry digital terrain models in a forest area on Tenerife Island. Can. J. Remote Sens. 2013, 39, 396–409. [Google Scholar]
  38. Besl, P.J.; McKay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256. [Google Scholar] [CrossRef]
Figure 1. Integration scheme developed for the LiDAR system.
Figure 2. Implementation process flow for the proposed system.
Figure 3. Component drawings of Parts A, B, C, D, E, and F. (a) Part A: LiDAR sensor support structure and cover; (b) Part B: Mini Computer, LiPo battery and connection cable support tray; (c) Part C: connection between tray and LiDAR sensor support structure; (d) Part D: UAV rail support anchor; (e) Part E: support handles; (f) Part F: GPS support structure.
Figure 4. (a) Makerbot Replicator Plus printer; (b) PLA filament roll; (c) printed part.
Figure 5. Integration and parts of the LiDAR system with the Matrice 600 Pro UAV.
Figure 6. Marking of variables obtained by the total station.
Figure 7. LiDAR communications scheme for collecting data.
Figure 8. Reference frames used for the georeferenced point cloud generation.
Figure 9. Non-calibrated point cloud from the LiDAR reference system (x, y, z).
Figure 10. Non-calibrated point cloud RP in the UAV reference system (x′, y′, z′).
Figure 11. Parameters and calibration equation.
Figure 12. Point cloud PC calibrated using static variables in the UAV reference system (x′, y′, z′).
Figure 13. Point cloud georeferenced in the UAV reference system (x″, y″, z″).
Figure 14. Conversion of vectors to the UTM reference system.
Figure 15. Noise filtering process with tool selection and removal of isolated points. (a) Point cloud of the surveyed area before the noise filtering process; (b) point cloud of the surveyed area after the process.
Figure 16. UAV Matrice 600 flight plan using the Ground Station Pro (GSP) software.
Figure 17. UAV flight path in the UTM reference system.
Figure 18. Point cloud display in the LiDAR 360 software, after removing outliers and filtering noise. Colors indicate the height range of the collected volumes.
Figure 19. Control points for validation. (a) Area points in orthophoto; (b) Area points in plan.
Figure 20. LiDAR point correction for the X and Y axes.
Figure 21. LiDAR point correction for the Z axis.
Figure 22. Photogrammetry point cloud (upper image) and LiDAR point cloud (lower image).
Table 1. Equipment specifications.
Puck Lite Velodyne VLP-16 Laser Scanner [30]
  Channels: 16
  Measurement Range/Accuracy: 100 m/±3 cm
  Field of View (Vertical/Horizontal): −15° to 15°/360°
  Angular Resolution (Vertical/Horizontal): 2°/0.1°–0.4°
  Rotation Rate: 5 Hz–20 Hz
  Laser (Classification/Wavelength): Class 1/160 mm
  Measurement Rate (Single/Dual Return Mode): ~30,000 pps/~600,000 pps
  Power Consumption/Weight: 8 W/800 g
Matrice 600 Pro UAV [31]
  Weight/Maximum Payload Recommended: 9.5 kg/6 kg
  Hovering Time (No Payload/Full Payload): 32 min/16 min
  Maximum Travel/Ascent/Descent Speeds: 18 m/s, 5 m/s, 3 m/s
  Flight Control System: A3 Pro
  Position Accuracy (x, y/z): <3 m/<5 m
  Orientation Accuracy (pitch, roll/yaw): <1°/<3°
  Power Consumption (6 TB47S Batteries): 599.4 Wh
Odroid XU4 Mini Computer [32]
  Processor: Cortex A15 (2 GHz)/Cortex A7 (8 cores)
  Graphics Processor/RAM: ARM Mali T628 MP6/2 GB LPDDR3
  Communication Ports: 2× USB 3.0/1× USB 2.0/Gigabit Ethernet
  Storage Device: eMMC (up to 64 GB)/SD (up to 128 GB)
Table 2. 3D printing materials comparison.
Property | PLA | ABS
Derived From | Corn starch or sugar cane | Petroleum
Melting Point | 145 °C–160 °C | 225 °C–245 °C
Sensitivity to Temperature Change | Low | High
Smoke Intensity | Low | High
Tensile Modulus of Elasticity | 2346.5 MPa | 1681.5 MPa
Tensile Strength at Yield | 49.5 MPa | 39.0 MPa
Tensile Strength at Break | 45.6 MPa | 33.9 MPa
Strain at Yield | 3.3% | 3.5%
Flexural Strength | 103.0 MPa | 70.5 MPa
Hardness | 83 (Shore D) | 76 (Shore D)
Table 3. Static and dynamic variables of the georeferencing equation.
Static Variables | Dynamic Variables
R_cal | P_t^g
R_s^b | R_b^g(t)
a_cal | p_t^s
l^b |
Table 4. Parameter equivalence for the point cloud calibration process.
Calibration Equation | Georeferencing Equation
R_al | R_cal
R_bs2bi | R_s^b
R_bs | p_t^s
a_al | a_cal
a_bi | l^b
Table 5. Parameter equivalence for Algorithm 1.
Calibration Equation | Georeferencing Equation
r_enu | p_t^g
p_enu_i | P_t^g
R_i | R_b^g(t)
Table 6. Flight parameters.
Flight Lines: 5
Altitude: 40 m
Flight Time: 3 min
Flying Speed: 3.6 m/s
Cover Area: 0.44 ha
Front Overlap: 81%
Side Overlap: 87%
Table 7. Coordinates of the surveyed points obtained with a total station, LiDAR and photogrammetry.
Point | Total Station x | Total Station y | Total Station z | LiDAR x | LiDAR y | LiDAR z | Photogrammetry x | Photogrammetry y | Photogrammetry z
T1 | 8,663,213.326 | 285,605.3695 | 255.153 | 8,663,211.895 | 285,608.022 | 219.431 | 8,663,213.268 | 285,605.359 | 255.333
T2 | 8,663,206.688 | 285,612.849 | 256.828 | 8,663,205.159 | 285,615.639 | 221.326 | 8,663,206.696 | 285,612.849 | 256.849
T3 | 8,663,200.097 | 285,620.306 | 255.22 | 8,663,198.054 | 285,622.75 | 219.698 | 8,663,200.119 | 285,620.300 | 255.265
T4 | 8,663,151.505 | 285,577.635 | 255.265 | 8,663,149.748 | 285,579.502 | 220.149 | 8,663,151.5 | 285,577.631 | 255.188
T5 | 8,663,158.034 | 285,570.161 | 256.765 | 8,663,156.595 | 285,572.131 | 221.608 | 8,663,158.031 | 285,570.151 | 256.738
T6 | 8,663,164.644 | 285,562.647 | 255.252 | 8,663,163.092 | 285,564.911 | 219.89 | 8,663,164.641 | 285,562.663 | 255.434
Table 8. Absolute error as a function of the difference in the control point coordinates between the methodologies applied, taking the total station survey as reference.
Point | LiDAR ΔX (m) | LiDAR ΔY (m) | LiDAR ΔZ (m) | Photogrammetry ΔX (m) | Photogrammetry ΔY (m) | Photogrammetry ΔZ (m)
T1 | 1.4319 | 2.6525 | 35.7220 | 0.0589 | 0.0105 | −0.1800
T2 | 1.5287 | 2.7903 | 35.5020 | −0.0080 | 0.0000 | −0.0210
T3 | 2.0432 | 2.4437 | 35.5220 | −0.0220 | 0.0060 | −0.0450
T4 | 1.7571 | 1.8670 | 35.1160 | 0.0050 | 0.0040 | 0.0770
T5 | 1.4393 | 1.9701 | 35.1570 | 0.0030 | 0.0100 | 0.0270
T6 | 1.5522 | 2.2636 | 35.3620 | 0.0030 | −0.0160 | −0.1820
RMSE (m): 35.514 (LiDAR); 0.114 (Photogrammetry)
Table 9. Corrected control point coordinates.
Point | Total Station x | Total Station y | Total Station z | LiDAR x | LiDAR y | LiDAR z
T1 | 8,663,213.326 | 285,605.3695 | 255.153 | 8,663,213.619 | 285,606.155 | 254.431
T2 | 8,663,206.688 | 285,612.849 | 256.828 | 8,663,206.883 | 285,613.772 | 256.326
T3 | 8,663,200.097 | 285,620.306 | 255.22 | 8,663,199.778 | 285,620.883 | 254.698
T4 | 8,663,151.505 | 285,577.635 | 255.265 | 8,663,151.472 | 285,577.635 | 255.149
T5 | 8,663,158.034 | 285,570.161 | 256.765 | 8,663,158.319 | 285,570.264 | 256.608
T6 | 8,663,164.644 | 285,562.647 | 255.252 | 8,663,164.816 | 285,563.044 | 254.89
Table 10. Corrected control point coordinate errors.
Point | LiDAR ΔX (m) | LiDAR ΔY (m) | LiDAR ΔZ (m)
T1 | −0.2921 | −0.7855 | 0.7220
T2 | −0.1950 | −0.9230 | 0.5020
T3 | 0.3190 | −0.5770 | 0.5220
T4 | 0.0330 | 0.0000 | 0.1160
T5 | −0.2850 | −0.1030 | 0.1570
T6 | −0.1720 | −0.3970 | 0.3620
RMSE (m): 0.766
Table 11. Distance errors between the LiDAR and UAV photogrammetry surveys.
Point Cloud | T1-T2 (m) | T2-T3 (m) | T3-T4 (m) | T4-T5 (m) | T5-T6 (m) | T6-T1 (m) | RMSE (m)
Total Station (Reference) | 9.952 | 10.001 | 64.771 | 10.008 | 9.924 | 64.668 |
LiDAR | 10.052 | 10.168 | 65.118 | 9.713 | 10.06 | 64.837 |
Error between TS and LiDAR | −0.100 | −0.167 | −0.347 | 0.295 | −0.136 | −0.169 | 0.171
UAV Photogrammetry | 9.939 | 9.965 | 64.711 | 9.988 | 9.93 | 64.687 |
Error between TS and UAV Photogrammetry | 0.013 | 0.036 | 0.060 | 0.020 | −0.006 | −0.190 | 0.024
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
