**1. Introduction**

Several Lidar manufacturers currently operate around the world, and their business is growing with the high demand for Lidar products in different industries. The major industry pushing Lidar companies to keep developing and stay competitive is the thriving autonomous navigation/driving industry [1–4]. Mapping companies, in turn, benefit from this continuous development by using Lidars as active scanning devices in ground-based or aerial systems.

The available Lidar types vary in many geometric and radiometric aspects, which affects both the target application and the selling price. The most common Lidar types are multi-beam time-of-flight (TOF) scanners, ranging from 8 up to 128 beams, which are spinning devices with a high rotation speed of 5–30 Hz [5–8]. Solid-state Lidars with a limited field of view (FOV) are also available, such as the Luminar H-series [9] and Blickfeld [10]. Furthermore, single-beam Lidars with high geometric performance in range accuracy, beam divergence, angular resolution, etc., are available on the market, such as the Riegl VUX H [11].

Since mobile mapping applications have different requirements than autonomous navigation, the Lidar does not necessarily have to be mounted vertically on its base on the car without tilt. Accordingly, to obtain a better density and coverage of scanned features such as road surfaces, it is necessary to tilt the Lidar in a mobile mapping system (MMS).

However, finding the ideal orientation angle of the Lidar device that fits the mapping requirements is not an easy task, especially given the variety of Lidar characteristics offered by the manufacturers [12,13]. Furthermore, determining the angular orientation is vital and more complicated when placing more than one Lidar in a mapping system to increase the amount of produced data.

Confirming point cloud coverage and density are common quality assurance (QA) and quality control (QC) tasks in workflows that comprise the processing of Lidar data [14,15]. Thus, in this paper, the ideal orientation of the Lidar device is estimated based on the production efficiency in terms of point density, coverage amount, and predicted accuracy.

For mobile mapping and geospatial applications, the density of points on road surfaces is in high demand [16,17], besides the density and coverage on off-ground features like building facades in urban environments. The major mobile mapping applications, such as engineering surveys, digital terrain modeling, clearance computations, road asset inventory, drainage analysis, virtual three-dimensional (3D) design, as-built documentation, and structural inspection, in general require high point densities (>100 points per square meter, pts/m<sup>2</sup>) and high accuracies (<5 cm) [16]. Furthermore, the required mapping coverage is governed, among many factors, by the Lidar scanning range, FOV, and the number of installed Lidar devices in the mobile mapping system.

It should be noted that the density, accuracy, and coverage characteristics sometimes contradict each other. As an example, Lidars with a maximum range of up to 250 m can increase the coverage compared to Lidars with a 100 m maximum scanning range. However, this results in a higher beam divergence and range uncertainty, which should be accounted for.
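To make this trade-off concrete, the laser footprint on the target (and with it part of the range and angular uncertainty) grows roughly linearly with range. The sketch below uses an illustrative 3 mrad divergence, not a value from either datasheet:

```python
def footprint_diameter(range_m, divergence_mrad, exit_aperture_m=0.0):
    """Approximate laser footprint diameter (m): the beam widens roughly
    linearly with range, d ~= aperture + range * divergence.
    The 3 mrad divergence used below is illustrative only."""
    return exit_aperture_m + range_m * divergence_mrad / 1000.0

# A longer-range return carries a larger footprint, hence more uncertainty:
for r in (100.0, 250.0):
    print(f"{r:.0f} m -> {footprint_diameter(r, 3.0) * 100:.0f} cm footprint")
```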

Accordingly, the best orientation angle of the Lidar is the one producing a compromise of a higher number of points on the road surface and facades while ensuring a high percentage of coverage and accuracy.

An investigation of two newly released state-of-the-art 64-channel Lidar types is applied to quantify the ideal angular orientation that fits mobile mapping applications.

Both Lidar types are multi-beam spinning Lidars with 64 channels and represent potential Lidar candidates for an MMS, especially given their reasonable size, weight, and cost. These two Lidars were recently released by different manufacturers in China and the USA. The first Lidar is the Hesai Pandar64 (Figure 1a) and the second is the Ouster OS-1-64 (Figure 1b).

**Figure 1.** (**a**) Pandar64 [11]. (**b**) Ouster Lidar types [13,15].

Pandar64 has a longer maximum range of 200 m at 10% target reflectivity while having an irregular beam distribution, or what may be called a gradient distribution. On the other hand, OS-1-64 has a shorter range of 120 m at 80% target reflectivity with a linear distribution of beams and a smaller beam divergence. The specifications of each Lidar type are summarized in Table 1, which shows that every type has its own advantages. Initially, based on the specifications, they are both suitable for high-productivity mapping applications. A thorough investigation of their efficiency will be presented in Section 4.



**Table 1.** Specifications of the investigated Lidar types.

| Lidar Sensor/System | Ouster | HESAI |
| --- | --- | --- |
| Type/version | OS-1-64 | Pandar64 |

In Figure 2, the beam distribution pattern is given for both Lidars as published by their manufacturers [18,19]. Furthermore, to give the reader a visual impression of how the scans are expressed for each Lidar type, a simulated scanning space surrounded by walls is applied. Every Lidar type is placed at the center of the space on its base without a tilt, in a stationary scan of one revolution.

**Figure 2.** Scanning patterns of the two Lidar types. (**a**) Pandar64 Lidar. (**b**) OS-1-64 Lidar.

#### **2. Methods**

In this section, the method, including the evaluation parameters and simulation assumptions about the Lidar performance, will be given, namely the angular orientations, scanning patterns, and the assessment of density, accuracy, and coverage. Figure 3 illustrates the methodology workflow. For both investigated Lidar types, the scanning routine is built based on the scanning properties given by the manufacturers. Then, a trajectory is defined in a 3D simulated urban model where a virtual mobile mapping system equipped with a single Lidar is mounted on a vehicle driving along the trajectory. It should be noted that the simulations are applied using the Blender tool [20]. In every run, a new tilting angle is initiated to orient the Lidar device, and then a simulated scan is applied. The point cloud produced at every angle set-up is analyzed in terms of coverage, density, and accuracy, and then a final performance evaluation is concluded.
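The per-angle evaluation loop of this workflow can be sketched as below; `evaluate_orientations`, `simulate_scan`, and the assessor callables are hypothetical stand-ins for the Blender-based scan simulation and the coverage/density/accuracy analyses of the following subsections:

```python
# Sketch of the per-angle evaluation loop of the methodology workflow.
# The callables passed in are placeholders, not a real API.

def evaluate_orientations(simulate_scan, assessors, tilt_angles_deg):
    """Collect one metrics dict per investigated tilt angle."""
    results = {}
    for tilt in tilt_angles_deg:
        cloud = simulate_scan(tilt)            # point cloud for this set-up
        results[tilt] = {name: fn(cloud) for name, fn in assessors.items()}
    return results

# Dummy callables standing in for the real pipeline:
demo = evaluate_orientations(
    simulate_scan=lambda tilt: {"tilt_deg": tilt},
    assessors={"coverage": lambda c: None,
               "density": lambda c: None,
               "accuracy": lambda c: None},
    tilt_angles_deg=range(0, 100, 10),
)
```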


**Figure 3.** Methodology workflow.

*2.1. Lidar Angular Set Up*

In this research paper, the ideal angular orientation of the two Lidar types is investigated to determine their suitability for mobile mapping applications. The investigated angular range of the Lidar is defined as shown in Figure 4.


As shown in Figure 4a, the 0° tilting angle (blue color) is defined when the Lidar is placed on its base, in which case the Lidar scanning lines constitute concentric circles on the ground (Figure 4b). This 0° Lidar orientation is the one mostly used in autonomous driving. On the other hand, when the Lidar is rotated 90° (red in Figure 4a), the scans hit the ground surface at close to right angles and result in a parallel scan-line pattern (Figure 4b).


**Figure 4.** (**a**) Lidar tilt angles illustration. (**b**) Patterns of the Lidar beams at 0° (blue) and 90° (red) angles on a road surface from a top view.

*2.2. Accuracy Assessment*


Accuracy assessment is an important measure to check the quality of the scanned objects at the different tilting angles of the Lidar. This estimation of accuracy can be achieved by applying the Lidar error model of Equation (1) [21]:

$$P\_i^w(t) = M\_{\rm INS}^w(t) P\_i^{\rm INS}(t) + T\_{\rm INS}^w(t) \tag{1}$$

where $P_i^w(t)$ represents the 3D coordinates of a point $(X_i, Y_i, Z_i)$ in the world coordinate system at time $t$, $T_{INS}^w(t)$ is the position of the inertial navigation system (INS) in the world coordinate system at time $t$, $M_{INS}^w(t)$ is the rotation matrix between the INS body frame and the world coordinate system at time $t$, and $P_i^{INS}(t)$ is the position of the target point in the INS body frame at time $t$.

However, a pre-calibrated Lidar device is assumed in this study; as a result, errors in the lever arm offset and boresight angles are assumed insignificant. Accordingly, the error propagation model is simplified such that the position of every scanned point $X_i, Y_i, Z_i$ at time $t$ can be formulated as in Equation (2):

$$\begin{array}{l} X\_{i} = \ X\_{1} + R\cos(Az)\cos(V) \\ Y\_{i} = \ Y\_{1} + R\sin(Az)\cos(V) \\ Z\_{i} = \ Z\_{1} + R\sin(V) \end{array} \tag{2}$$

where $R$ is the measured range distance from the Lidar to the object point $P$, $Az$ is the measured azimuth angle of the laser beam, $V$ is the vertical angle of the laser beam measured from the horizon, and $X_1$, $Y_1$, and $Z_1$ are the coordinates of the Lidar sensor at time $t$.
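Equation (2) can be sketched directly in code; the function name and the horizontal-launch example below are illustrative, not from the paper:

```python
import math

def scan_point(x1, y1, z1, r, az, v):
    """Equation (2): Cartesian coordinates of a scanned point from the
    sensor position (x1, y1, z1), measured range r, azimuth az, and
    vertical angle v (radians, measured from the horizon)."""
    xi = x1 + r * math.cos(az) * math.cos(v)
    yi = y1 + r * math.sin(az) * math.cos(v)
    zi = z1 + r * math.sin(v)
    return xi, yi, zi

# A horizontal beam (v = 0) fired along the x-axis (az = 0) from a sensor
# 2 m above the ground lands r metres away at the same height:
print(scan_point(0.0, 0.0, 2.0, 10.0, 0.0, 0.0))  # -> (10.0, 0.0, 2.0)
```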

It should be noted that the errors in the Lidar coordinates $X_1$, $Y_1$, $Z_1$ are predicted in normal cases from an integrated INS. If the accuracy of the navigated Lidar coordinates is not included in the model of Equation (2), then the estimated errors of the scanned points will represent the relative accuracy.

An error propagation can be applied using the Jacobian matrix $J$ to estimate the errors of the scanned points in a point cloud, as in Equations (3) and (4) [22]:

$$J = \begin{bmatrix} \frac{\partial X\_i}{\partial Az} & \frac{\partial X\_i}{\partial V} & \frac{\partial X\_i}{\partial R} \\ \frac{\partial Y\_i}{\partial Az} & \frac{\partial Y\_i}{\partial V} & \frac{\partial Y\_i}{\partial R} \\ \frac{\partial Z\_i}{\partial Az} & \frac{\partial Z\_i}{\partial V} & \frac{\partial Z\_i}{\partial R} \end{bmatrix} = \begin{bmatrix} -R\sin(Az)\cos(V) & -R\cos(Az)\sin(V) & \cos(Az)\cos(V) \\ R\cos(Az)\cos(V) & -R\sin(Az)\sin(V) & \sin(Az)\cos(V) \\ 0 & R\cos(V) & \sin(V) \end{bmatrix} \tag{3}$$

$$\Sigma\_{XYZ} = J\, \Sigma\_{R,Az,V}\, J^{t} = \begin{bmatrix} \sigma\_{Xi}^{2} & & \\ & \sigma\_{Yi}^{2} & \\ & & \sigma\_{Zi}^{2} \end{bmatrix} \tag{4}$$

where $\Sigma_{R,Az,V}$ is the variance-covariance matrix of the scanned range and angles to the point cloud, $\Sigma_{XYZ}$ is the variance-covariance matrix of the derived coordinates of the point cloud, $\sigma_{Xi}$, $\sigma_{Yi}$, $\sigma_{Zi}$ are the standard deviations of the coordinates of point $i$, and $t$ denotes the matrix transpose.
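Equations (3) and (4) can be sketched as follows, assuming uncorrelated range and angle measurements and, for illustration, the ±3 cm range and ±0.1° angular standard deviations quoted later in the experiment design; the covariance diagonal is ordered (Az, V, R) to match the Jacobian columns:

```python
import numpy as np

def point_covariance(r, az, v, sigma_r=0.03, sigma_ang_deg=0.1):
    """Equations (3)-(4): propagate range/angle standard deviations to a
    3x3 variance-covariance matrix of the point coordinates."""
    sa = np.deg2rad(sigma_ang_deg)
    J = np.array([
        [-r*np.sin(az)*np.cos(v), -r*np.cos(az)*np.sin(v), np.cos(az)*np.cos(v)],
        [ r*np.cos(az)*np.cos(v), -r*np.sin(az)*np.sin(v), np.sin(az)*np.cos(v)],
        [ 0.0,                     r*np.cos(v),             np.sin(v)],
    ])
    # Diagonal ordered (Az, V, R) to match the Jacobian columns above.
    sigma = np.diag([sa**2, sa**2, sigma_r**2])
    return J @ sigma @ J.T

# Standard deviations grow with range, as the error ellipsoids of Figure 5 show:
for r in (20.0, 80.0):
    cov = point_covariance(r, az=0.0, v=np.deg2rad(-15.0))
    print(r, np.sqrt(np.diag(cov)).round(3))
```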

For illustration, exaggerated ellipsoids of errors are shown in Figure 5 for the scan sweep at 30° of a multi beam Lidar on the ground. Figure 5 illustrates how errors are larger for scanned ground points at longer scanning ranges away from the Lidar.

**Figure 5.** Errors representation of a scanned road surface by a single multi beam Lidar mounted at the rear part of a car roof.

*2.3. Density Assessment*

Computing the point density is one of the performance evaluation categories in this paper; it describes the number of points in one square meter (pts/m<sup>2</sup>). The density computation is applied simply by finding the nearest neighbor points of every scanned point within a search radius and counting their number as the density.

As there will be a definite opportunity for having empty space between measurements, it is also possible to describe density by the point spacing [23]. The spacing between points can be calculated from the density as in Equation (5) [16]:

$$\text{sample spacing} = \sqrt{\frac{1}{\text{point density}}}\tag{5}$$

However, this simple equation assumes a uniform distribution of points, which is not the case for the two Lidar types under investigation. It is worth mentioning that, in this paper, point clouds were simulated assuming one pulse return, which also complies with the US Geological Survey standards.
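A minimal sketch of both measures, assuming a brute-force neighbour count within a search disc and the uniform-spacing relation of Equation (5) (search radius and point sets are illustrative):

```python
import math

def local_density(points, center, radius=1.0):
    """Points per square metre around `center`: count neighbours within
    `radius` (brute force; a KD-tree would be used for real clouds) and
    divide by the search-disc area."""
    n = sum(1 for p in points if math.dist(p[:2], center[:2]) <= radius)
    return n / (math.pi * radius ** 2)

def sample_spacing(point_density):
    """Equation (5): mean point spacing, assuming uniformly spread points."""
    return math.sqrt(1.0 / point_density)

print(sample_spacing(100.0))  # 100 pts/m^2 -> 0.1 m spacing
```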


*2.4. Coverage Assessment*


Coverage or completeness is one of the important quality measures of point clouds. Although this is a difficult task to define for ground-based mapping systems, it is possible to assess it in this paper, since the reference simulated urban 3D model is well defined.

The coverage percentage is computed by subtracting the scanned point cloud (blue color) from the reference model point cloud (gray color), as shown in Figure 6. First, the proximity between the scanned points and the evenly spaced reference points is computed. Then, only the assigned points in the reference point cloud are considered for assessing the coverage against the whole reference point cloud.
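As a rough sketch of this proximity test (the tolerance and point sets below are illustrative; a KD-tree such as `scipy.spatial.cKDTree` would replace the brute-force search on real clouds), coverage is the share of reference points that have a scanned neighbour within a tolerance:

```python
import math

def coverage_percent(reference, scanned, tol=0.10):
    """Share (%) of reference-model points with at least one scanned point
    within `tol` metres (brute force; a KD-tree would be used at scale)."""
    covered = sum(1 for ref in reference
                  if any(math.dist(ref, s) <= tol for s in scanned))
    return 100.0 * covered / len(reference)

# Illustrative 1 m-spaced reference line and a sparse scan of it:
reference = [(float(x), 0.0, 0.0) for x in range(10)]
scanned = [(0.02, 0.0, 0.0), (5.03, 0.01, 0.0)]
print(coverage_percent(reference, scanned))  # -> 20.0
```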


**Figure 6.** Coverage evaluation where blue color represents the scanned point cloud subtracted out of the simulated reference gray point cloud.

*2.5. Scanning Pattern Assessment*

To decide which orientation angles should be investigated in the experiment of the two Lidar types, we applied a scanning simulation over an angular range of 0°–90° at a 10° interval. This is simply simulated by placing every Lidar at 2 m height in an MMS, where a one-revolution scan is applied to a planar ground and a facade (Figure 7).

**Figure 7.** A simulated scan on the ground and aside facade.

The pattern of a stationary scanning sweep on the ground for the two Lidar types at the selected tilt angles from 0°–90° is shown in Figure 8, with a scanning range of 20 m.
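Under simplifying assumptions (ideal rays, a flat ground plane, the tilt modeled as a rotation about a horizontal axis; all names and values below are illustrative), the ground footprint of one revolution at a given tilt can be sketched as:

```python
import math

def ground_hits(tilt_deg, vert_angles_deg, n_azimuth=360, height=2.0):
    """Ground-plane (z = 0) hit points of one revolution of a multi beam
    Lidar mounted at `height` m and tilted by `tilt_deg` about a
    horizontal (y) axis. Idealized rays only; no beam divergence."""
    t = math.radians(tilt_deg)
    hits = []
    for v_deg in vert_angles_deg:
        v = math.radians(v_deg)
        for k in range(n_azimuth):
            az = 2.0 * math.pi * k / n_azimuth
            # Beam direction in the sensor frame (cf. Equation (2)).
            dx = math.cos(v) * math.cos(az)
            dy = math.cos(v) * math.sin(az)
            dz = math.sin(v)
            # Apply the tilt as a rotation about the y-axis.
            dx, dz = (dx * math.cos(t) + dz * math.sin(t),
                      -dx * math.sin(t) + dz * math.cos(t))
            if dz < 0:                      # only downward-pointing beams
                s = height / -dz            # ray length to the ground
                hits.append((s * dx, s * dy))
    return hits

# At 0 deg tilt the downward beams trace the concentric circles of Figure 4b;
# increasing the tilt deforms them towards the parallel-line pattern.
pattern_30 = ground_hits(30.0, vert_angles_deg=range(-22, 0))
```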



**Figure 8.** Stationary scanning patterns at a scanning range of 20 m on the ground. (**a**) OS-1-64. (**b**) Pandar64.

The scanning patterns of both Lidar types shown in Figure 8 are analyzed in terms of density, coverage, and accuracy to obtain the performance histograms shown in Figure 9.

**Figure 9.** Performance test of the individual scans. (**a**) OS-1-64. (**b**) Pandar64.

A slight difference between OS-1-64 and Pandar64 is found in the coverage aspect. Whenever the tilt angle increases beyond 10°, the OS-1-64 coverage scanning footprint decreases. On the other hand,


for Pandar64, the coverage area decreases constantly when the tilting angle is greater than 25°. This might be related to: (a) the symmetry of the vertical FOV around the horizon and (b) the distribution pattern of the beams as illustrated in Figure 2. On the other hand, both Lidars have the same behavior for the density and accuracy magnitudes. What can be clearly identified from the histograms in Figure 9 is the inverse relation between density and accuracy on one side versus coverage on the other. Accordingly, a compromise of these contradicting properties has been reached by considering the tilting angle range of 30°–50°. This angular orientation range is in general the most suitable for experimenting and evaluating the two Lidars' performance for mobile mapping applications, as will be presented in Section 4.

#### **3. Experiment Design**

The experimental test was applied in a simulated urban environment (Figure 10) containing different buildings, light poles, traffic lights, moving cars, and trees. The simulated drive was applied to a mobile mapping system equipped with the mentioned 64-channel Lidars OS-1-64 and Pandar64 at a height of 2 m above the ground and a constant driving speed of 36 km/h.

**Figure 10.** Virtual reality simulation urban model.


Five tilting angles of 30◦ , 35◦ , 40◦ , 45◦ , and 50◦ were investigated in the scanning for both Lidars. It should be noted that the simulations omitted any limited FOV and the existence of occlusions that might occur because of the vehicle body or the camera system itself and assumed a perfect occlusion-free mobile mapping system.

The accuracy evaluations of the two Lidars are represented by standard deviations derived in the propagation of error as described in Section 2.2. A standard deviation of ±3 cm in range and ±0.1◦ in angles were used in the calculations related to the manufacturers' specifications.

Two density checks are applied: (1) an overall ground and off-ground density check using the CloudCompare tool [24]; (2) a density check on two selected slices on the ground (20 m length) and off ground (40 m height), as shown in Figure 11. The density tests are applied for all the aforementioned tilting angles of both Lidars, as will be shown in Section 4.

**Figure 11.** Selected testing point cloud slices in blue. (**a**) Ground slice. (**b**) Facade slice.

A third test type was applied to check the density achieved on a traffic board sign located at a distance of five meters from the trajectory, which increased as the car turned onto the left road in the driving scenario (Figure 12). The test checked the effectiveness of cylindrical post detection at the different tilting angles for the two Lidar types.

**Figure 12.** Traffic sign testing.


#### **4. Results**


In this section, the performance evaluation of the two Lidar types OS-1-64 and Pandar64 will be shown by a simulated drive with an MMS equipped with one Lidar device at a time. The simulations were applied using the same trajectory and a constant vehicle speed of 36 km/h. It should be noted that we did not apply simulations at different driving speeds because of the linear dependency between the amount of produced point cloud data and the mapping vehicle speed.
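That linear dependency can be made explicit: at a fixed firing rate, the number of points deposited per metre of trajectory is simply the rate divided by the speed. In the sketch below, the ~1.3 Mpts/s rate is an assumed nominal value for illustration, not a quoted specification of either sensor:

```python
def points_per_metre(points_per_second, speed_kmh):
    """Points laid down per metre of trajectory: halving the driving
    speed doubles the along-track point count, and vice versa."""
    return points_per_second / (speed_kmh / 3.6)

# Assumed nominal firing rate at the simulated 36 km/h (= 10 m/s):
print(round(points_per_metre(1_310_720, 36.0)))  # -> 131072
```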


#### *4.1. Simulation Results of Pandar64*


For every tilting angle of 30°, 35°, 40°, 45°, and 50°, the point cloud density was computed on both the ground and off ground. The results are represented in Figure 13, where the density on ground features increases whenever the tilting angle increases.


**Figure 13.** Pandar64 point cloud density on the ground and off ground features and their achieved coverage percentage. (**a**) Point cloud @30°. (**b**) Graphical representation of the coverage and density achieved.

The density check on the selected slices of Figure 11 was also evaluated and is represented in Figure 14. It should be noted that the façade slice was 5 m away from the mapping trajectory.

**Figure 14.** Pandar64 density checks on selected point cloud slices. (**a**) Density computed on the ground section. (**b**) Density computed on the facade section.

It should be noted that the 0 value of the x-axis indicates the scanning trajectory on the ground section in Figure 14a, while the start elevation of the façade section is shown in Figure 14b. For facades, higher densities are achieved up to a height of 4 m, and the density then drops quickly as the elevation increases, while the high scanning density on the ground (≈2000 pts/m<sup>2</sup>) is achieved within an 8 m road width.

A large amount of produced data is shown in Figures 13 and 14 in terms of the achieved density of the point clouds at every tilting angle. This density, if the achieved accuracy is not counted, is highly suitable for engineering surveys, asset management inventory, roadway condition assessment, powerline clearance, 3D designs, as-built surveys, and other general mapping applications, as indicated in [16].


Coverage percentage was evaluated by comparing the resulting point cloud at each tilt angle to the reference model (Figure 15). The coverage was found to fluctuate slightly, as shown in Figure 13, where it starts at a high percentage at 30°, decreases at 40°, and then improves at 45°. However, the difference is within 0.25%.
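A minimal sketch of such a coverage comparison, assuming coverage is scored as the share of reference-model samples that have a scanned point within a distance tolerance (the exact criterion used in the paper may differ):

```python
import numpy as np

def coverage_percent(reference, scanned, tol=0.2):
    """Share of reference samples with at least one scanned point
    within `tol` metres: a simple proxy for model coverage."""
    d2 = ((reference[:, None, :] - scanned[None, :, :]) ** 2).sum(-1)
    hit = d2.min(axis=1) <= tol**2
    return 100.0 * hit.mean()

# Hypothetical samples: three of four reference points are covered.
ref = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]], float)
scan = np.array([[0.05, 0, 0], [1.1, 0, 0], [2.05, 0.05, 0]], float)
print(coverage_percent(ref, scan, tol=0.2))  # 3 of 4 covered -> 75.0
```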


**Figure 15.** Illustration of point cloud coverage amount with respect to the reference model.

As mentioned earlier, the third parameter for the performance evaluation is the accuracy of the positions of the scanned points. The relative accuracy was computed on both the ground and off ground features, and was found to improve whenever the tilting angle increased, as shown in Figure 16a. Furthermore, to illustrate the predicted accuracy at every tilting angle, a cumulative distribution function (CDF) chart [21] is plotted for the whole point clouds in Figure 16b.

**Figure 16.** (**a**) Histogram illustrating the achieved relative accuracy using Pandar64. (**b**) CDF accuracy chart.

The CDF chart shows the accuracy of the point cloud at each tilting angle using Equation (4). A better accuracy is achieved at larger tilting angles, which is logical since the incidence angles of the laser beams get closer to right angles at the road surface and at the facades on the trajectory sides.

The estimated accuracy at the 50° and 30° Lidar tilt angles is illustrated in Figure 17a,b, respectively, where every point with a ≥5 cm standard deviation in position is colored red. For a better illustration, the point clouds were also tested on both selected slices of Figure 11, on the ground and on the façade. Figure 17c,d show the estimated accuracy at every Lidar tilting angle for both testing slices, where the points with a standard deviation ≥5 cm are colored red. The summarized average standard deviations at both testing slices are listed in Table 2.
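A CDF of this kind is straightforward to reproduce from the per-point standard deviations; a minimal empirical-CDF sketch over hypothetical sample values:

```python
import numpy as np

def ecdf(sigmas):
    """Empirical CDF of per-point standard deviations: returns sorted
    values and the cumulative fraction of points at or below each one."""
    x = np.sort(np.asarray(sigmas))
    y = np.arange(1, len(x) + 1) / len(x)
    return x, y

# Fraction of points at or better than 5 cm for a toy accuracy sample [m]:
x, y = ecdf([0.031, 0.044, 0.038, 0.052, 0.047, 0.061])
print(y[x <= 0.05][-1])
```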

**Table 2.** The estimated average standard deviation on the ground and facade slices using Pandar64.

| **Lidar Tilt Angle** | **Façade Slice [cm]** | **Ground Slice [cm]** |
|---|---|---|
| 50° | 3.75 ± 0.95 | 3.82 ± 0.42 |
| 45° | 3.76 ± 1.03 | 3.87 ± 0.46 |
| 40° | 3.76 ± 1.06 | 3.93 ± 0.51 |
| 35° | 3.75 ± 1.08 | 4.01 ± 0.58 |
| 30° | 3.76 ± 1.13 | 4.12 ± 0.68 |


**Figure 17.** Accuracy-labeled point clouds scanned by Pandar64 (red: σ ≥ 5 cm). (**a**) Accuracy at 50° Lidar tilt angle. (**b**) Accuracy at 30° Lidar tilt angle. (**c**) Accuracy estimated at the ground slice for every tilting angle. (**d**) Accuracy estimated at the façade slice for every tilting angle.



The plots show a higher relative accuracy at the 50◦ angle compared to the 30◦ angle. This difference in accuracy is clearly related to the incidence angle of the scanning beams as mentioned.

Figure 18 illustrates examples of the incidence angle magnitudes achieved on the ground at three different Lidar tilting angles of 15°, 30°, and 50°. The figure shows how the increase of the laser beams' incidence angles on the ground is related to the increase of the Lidar tilt angle.
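Per point, the incidence angle follows from the beam direction and the local surface normal. The sketch below assumes the convention of the figures, where the angle is measured from the surface plane so that larger values mean a more perpendicular hit:

```python
import numpy as np

def incidence_angle_deg(beam_dir, surface_normal):
    """Angle between the laser beam and the surface plane: 90 deg is a
    perpendicular hit, small values are grazing returns."""
    b = np.asarray(beam_dir, float) / np.linalg.norm(beam_dir)
    n = np.asarray(surface_normal, float) / np.linalg.norm(surface_normal)
    return 90.0 - np.degrees(np.arccos(np.clip(abs(b @ n), 0.0, 1.0)))

# A beam descending at 45 deg onto flat ground (normal +z):
print(round(incidence_angle_deg([1.0, 0.0, -1.0], [0.0, 0.0, 1.0]), 6))  # -> 45.0
```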

**Figure 18.** The relation between the Pandar64 Lidar tilt angle and the incidence angle, where red points have a ≥20° incidence angle on the ground. (**a**) @15° tilting angle: the average angle of incidence ≈14° ± 5°. (**b**) @30° tilting angle: the average angle of incidence ≈23° ± 9°. (**c**) @50° tilting angle: the average angle of incidence ≈44° ± 22°.


The third test, as mentioned, checks the scanning quality of a traffic sign at a road corner (Figure 12). The traffic sign test shows that the average number of scan points decreases from an average of 838 pts/m<sup>2</sup> at a 30° tilt angle to 507 pts/m<sup>2</sup> at a 50° tilt angle (Figure 19a). The CloudCompare RANSAC shape detection plugin was used to detect the geometric primitives of the sign's cylindrical legs (Figure 19b). Accordingly, the deviation of the two detected cylindrical legs' radii was computed with respect to the reference model radius (Figure 19c).

**Figure 19.** (**a**) Density of points at the road board sign at each tilting angle (red ≥ 1200 pts/m<sup>2</sup>). (**b**) An example of the detected primitives of the board sign. (**c**) Histogram of differences in the estimated cylindrical board legs' radii.

Assuming good setup parameters were entered for the cylinder fitting algorithm, we can say that, except for the point cloud produced at 30° tilt, the cylinder fitting at the other angles produced extra small cylinders as false positive detections.
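The detection itself relies on CloudCompare's RANSAC plugin; for intuition, the core radius-estimation step for one cylindrical leg can be sketched as an algebraic circle fit on a thin horizontal slice (a simplification of full cylinder fitting, shown here on synthetic points):

```python
import numpy as np

def circle_fit_radius(xy):
    """Algebraic (Kasa) circle fit on a thin horizontal slice of a
    cylindrical post: solves x^2 + y^2 = 2ax + 2by + c by least squares
    and recovers the radius r = sqrt(c + a^2 + b^2)."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.c_[2 * x, 2 * y, np.ones(len(x))]
    a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    return np.sqrt(c + a**2 + b**2)

# A half-arc of a 5 cm radius leg (only one side is seen by the scanner):
t = np.linspace(0, np.pi, 30)
pts = np.c_[0.05 * np.cos(t) + 1.0, 0.05 * np.sin(t) + 2.0]
print(round(circle_fit_radius(pts), 4))  # -> 0.05
```

With noisy, sparse arcs (as at the larger tilt angles) this fit degrades, which is consistent with the spurious small cylinders reported above.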

#### *4.2. Simulation Results of OS-1-64*


Similar to the tests applied for Pandar64 in Section 4.1, the density was computed for the point clouds produced from the OS-1-64 for every tilting angle of 30°, 35°, 40°, 45°, and 50°. The point cloud density was computed on both the ground and off ground. The results are shown in Figure 20, where the density on ground features increases whenever the tilting angle increases.


**Figure 20.** Point cloud density of OS-1-64 on the ground and off ground features and their achieved coverage percentage. (**a**) Point cloud @30°. (**b**) Graphical representation of the coverage and density achieved.


The density check on the selected slices of Figure 11 was also evaluated and is represented in Figure 21. It should be noted that a 0 value on the x-axis indicates the scanning trajectory on the ground section in Figure 21a, while the start elevation of the façade section is shown in Figure 21b.

**Figure 21.** OS-1-64 density checks on selected point cloud slices. (**a**) Density computed on the ground section. (**b**) Density computed on the 5 m distant facade section.

Figures 20 and 21 indicate a large amount of produced data in terms of the achieved density of the point clouds at every tilting angle.

Furthermore, the coverage percentage was evaluated by comparing the resulting point cloud at each tilt angle to the reference model. The coverage was found to decrease whenever the tilting angle increases, as shown in Figure 20.

The performance evaluation in terms of the accuracy of the positions of the scanned points was also applied to the point clouds produced by the OS-1-64. Similar to the Pandar64 case, the relative accuracy computed on both ground and off ground features was found to improve whenever the tilting angle increased, as shown in Figure 22a. Furthermore, to illustrate the accuracy predicted at every tilting angle, a CDF chart was plotted for the whole point clouds in Figure 22b.

**Figure 22.** (**a**) Histogram illustrating the achieved relative accuracy using OS-1-64. (**b**) CDF accuracy chart.



The CDF chart shows the accuracy of the point cloud at each tilting angle using the error propagation described in Section 2.2. The estimated accuracy at the 50° and 30° Lidar tilt angles is illustrated in Figure 23a,b, respectively, where every point with a ≥5 cm standard deviation in position is colored red. For a better illustration, the point clouds were also tested on both selected slices of Figure 11, on the ground and on the façade. Figure 23c,d show the estimated accuracy at every Lidar tilting angle for both testing slices. The summarized average standard deviations at both testing slices are listed in Table 3.

**Table 3.** The estimated average standard deviation on the ground and facade slices using OS-1-64.

| **Tilt Angle** | **Façade Slice [cm]** | **Ground Slice [cm]** |
|---|---|---|
| 50° | 3.85 ± 1.62 | 3.91 ± 0.86 |
| 45° | 3.87 ± 1.75 | 3.99 ± 0.97 |
| 40° | 3.88 ± 1.85 | 4.08 ± 1.09 |
| 35° | 3.91 ± 1.94 | 4.20 ± 1.30 |
| 30° | 3.95 ± 2.08 | 4.37 ± 1.53 |

**Figure 23.** Accuracy-labeled point clouds scanned by OS-1-64 (red: σ ≥ 5 cm). (**a**) Accuracy at 50° Lidar tilt angle. (**b**) Accuracy at 30° Lidar tilt angle. (**c**) Accuracy estimated at the ground slice for every tilting angle. (**d**) Accuracy estimated at the façade slice for every tilting angle.



This difference in accuracy at each Lidar tilt angle is clearly related to the incidence angle of the scanning beams. Figure 24 illustrates examples of the incidence angle magnitudes achieved on the ground at three different Lidar tilting angles of 15°, 30°, and 50°.

**Figure 24.** The relation between the OS-1-64 Lidar tilt angle and the incidence angle, where red points have a >20° incidence angle on the ground. (**a**) @15° tilting angle: the average angle of incidence ≈12° ± 8°. (**b**) @30° tilting angle: the average angle of incidence ≈20° ± 11°. (**c**) @50° tilting angle: the average angle of incidence ≈30° ± 16°.

As explained, the third test evaluates the scanning efficiency on a traffic sign at a road corner (Figure 12). The traffic sign test shows that the average number of scan points decreases from an average of 848 pts/m<sup>2</sup> at a 30° tilt angle to 539 pts/m<sup>2</sup> at a 50° tilt angle (Figure 25a). The deviation of the two detected cylindrical legs' radii was computed with respect to the reference model radius and is shown in Figure 25b.

**Figure 25.** (**a**) Density of points at the road board sign at each tilting angle (red ≥ 1200 pts/m<sup>2</sup>). (**b**) Histogram of differences in the estimated cylindrical board legs' radii.


Similar to the observation mentioned for Pandar64, the point cloud produced at a 30° tilt angle is more suitable in the sense of fewer fitting errors and false detections.

#### **5. Discussion**

The experimental results and analysis of the two Lidar types, Hesai Pandar64 and Ouster OS-1-64, are shown in Sections 4.1 and 4.2, respectively. The results are based on a 3D scanning simulation applied at different angular orientations (30° to 50°) for both types. The produced point clouds are analyzed in terms of point density (pts/m<sup>2</sup>), coverage, and accuracy.
