Article

An Algorithm for Generating Outdoor Floor Plans and 3D Models of Rural Houses Based on Backpack LiDAR

1 School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454003, China
2 Shanghai Huace Navigation Technology Ltd., Shanghai 201700, China
* Author to whom correspondence should be addressed.
Sensors 2024, 24(17), 5723; https://doi.org/10.3390/s24175723
Submission received: 25 July 2024 / Revised: 11 August 2024 / Accepted: 20 August 2024 / Published: 3 September 2024
(This article belongs to the Section Radar Sensors)

Abstract

As the Rural Revitalization Strategy continues to progress, there is an increasing demand for the digitization of rural houses, roads, and roadside trees. Given the characteristics of rural areas, such as narrow roads, high building density, and low-rise buildings, the precise and automated generation of outdoor floor plans and 3D models for rural areas is the core research issue of this paper. The specific research content is as follows: Using the point cloud data of the outer walls of rural houses collected by backpack LiDAR as the data source, this paper proposes an algorithm for drawing outdoor floor plans based on the topological relationships of sliced and rasterized wall point clouds. This algorithm aims to meet the needs of periodically updating large-scale rural house floor plans. By comparison with the coordinates of house corner points measured by RTK, it is verified that the floor plans drawn by this algorithm meet the accuracy requirements of 1:1000 topographic maps. Additionally, based on the generated outdoor floor plans, this paper proposes an algorithm for quickly generating outdoor 3D models of rural houses using the height information of wall point clouds. This algorithm generates outdoor 3D models of rural houses by longitudinally stretching the floor plans, meeting the requirements for 3D models in spatial analyses such as lighting and inundation. Measuring the distances from the wall point clouds to the 3D models and analyzing them statistically shows that the distances are concentrated between −0.1 m and 0.1 m. The 3D models generated by the proposed method can serve as one of the basic data sources for real-scene 3D construction.

1. Introduction

The traditional steps for generating indoor and outdoor floor plans of buildings typically include several stages: First, instruments such as total stations and RTK are used to collect the coordinates of the corners of the house walls, or tools like rangefinders and tape measures are employed to measure the dimensions of walls, doors, and windows. Then, the data is manually drawn based on field sketches. This method is not only labor-intensive but also prone to data omissions. In recent years, with the development of LiDAR technology [1,2,3], new methods have gradually been proposed. These methods first use LiDAR to collect point cloud data of the building’s interior and exterior, and then analyze and process the point cloud data to ultimately achieve the automated generation of indoor and outdoor floor plans. These methods have become a research hotspot [4,5,6,7]. Many scholars have conducted in-depth studies on this issue and proposed various interrelated yet distinct methods.
Based on the relevant literature, the steps for generating building floor plans using geometric and mathematical methods are roughly as follows [8,9,10,11,12,13]: first, extract the point cloud data of the building walls; then, project these wall point clouds onto a horizontal plane; and finally, draw the floor plan from the projected point clouds. The methods for extracting wall point clouds can be grouped as follows. Direct extraction from the middle of the building: an appropriate height range is selected based on the building structure, and the point cloud data within that range is extracted, as shown in Figure 1a. Statistical histogram based on building elevation information: Figure 1b shows a histogram of building elevation information generated from indoor point cloud data collected by backpack LiDAR; because the indoor data include point clouds of the ground and roof, the histogram better reflects the uniformity of the point cloud distribution. Random Sample Consensus (RANSAC): wall point clouds are identified and extracted using the RANSAC algorithm. For the extracted wall point clouds, the methods for drawing floor plans can be roughly categorized as follows [14,15]. Direct projection and resampling method: directly project the wall point clouds onto a horizontal plane and resample them, using the vector angles and distances of the point clouds to complete the floor plan drawing. Depth image conversion method: convert the wall point clouds into a depth image on the horizontal plane and analyze each line in the depth image to detect the actual wall positions. Polygon partitioning and energy minimization method: project the wall point clouds onto a horizontal plane to generate multiple polygon partitions, and use an energy minimization algorithm to select the boundaries that belong to the walls, thus completing the floor plan drawing.
In recent years, using deep learning to generate building floor plans has become a new research focus [16,17]. For example, the Scan2Plan model uses deep neural networks to cluster unordered point clouds in the initial stage, then predicts the corner points of each room and draws the floor plan of each room, and finally stitches the room floor plans together for output. Another model, FloorPP-Net, converts building point clouds into point pillars, predicts the corner points and wall positions, and thus generates the floor plans.
Building on the generation of outdoor floor plans, further research can be conducted to extend these into 3D models of buildings [18,19]. Current research focuses on using LiDAR to collect 3D coordinate data and image data of building surfaces. By analyzing this data, 3D models of buildings are generated, and image data is used for coloring, as shown in Figure 2. Many scholars have conducted in-depth research on this issue and proposed various methods. These methods can be broadly classified into automated drawing methods based on spatial geometry, analytic geometry, and deep learning, as well as manual drawing methods using existing 3D modeling software (such as Revit, ArchiCAD or Rhino).
The methods for generating 3D models of buildings based on spatial geometry can be broadly classified into three categories [20,21,22]: directly generating 3D models from raw point cloud data, for example, using Delaunay triangulation to generate 3D models from point clouds of mine tunnels; drawing 3D models from feature points, feature lines, or wireframe models of building walls, using these features and their topological relationships; and semantic clustering of building point clouds, which extracts objects such as walls, doors, and windows, draws their 3D models separately, and then merges them. The methods based on analytic geometry can likewise be roughly classified into three categories [23,24,25]: the Poisson surface reconstruction method, which effectively removes noise points from the point cloud; the Non-Uniform Rational B-Splines (NURBS) method, which constrains point cloud data to complete the 3D model drawing; and the partial differential equation (PDE) method, which divides the point cloud into multiple surface segments expressed by PDEs and then merges these segments to reconstruct the 3D structure of complex surfaces. With advancements in computer hardware capabilities, deep learning-based methods for generating 3D models of buildings have also increased [26,27,28]. For example, the Points2Surf model reconstructs 3D models of objects directly from raw point clouds without normal vector information, using detailed local patches and coarsely estimated global information. Another approach uses PointNet++ to extract deep features of each point and cluster them, determining the precise positions of building roof corner points and thereby generating 3D models from airborne LiDAR data.
Currently, the most commonly used method for generating 3D models of buildings involves first using LiDAR equipment to collect point cloud data of the buildings, then importing this data into existing 3D modeling software (such as Revit, ArchiCAD or Rhino), and finally manually creating the models [29,30,31]. This method has been widely applied in fields such as heritage preservation, civil defense engineering, disaster assessment, and substations.
Through an in-depth analysis of current domestic and international research, it is evident that there is limited exploration of automated outdoor floor plan generation, particularly incorporating attributes such as building materials and floor plan area. Rural houses, which are generally low-rise and varied in appearance, often face interference from vehicles, trees, and other obstructions during LiDAR data collection. This increases the difficulty of using point cloud data for the automated generation of outdoor floor plans of rural houses. Additionally, the surveying industry requires periodic updates of large-scale, high-resolution outdoor building floor plans. Therefore, we propose an algorithm that projects the point clouds of rural house exterior walls onto a horizontal plane and draws the outdoor floor plans based on the topological relationships of the wall point clouds. Compared with RTK-measured house corner coordinates, the floor plans generated using backpack LiDAR-collected outdoor point clouds of rural houses meet the accuracy requirements of 1:1000 topographic maps. This algorithm can also add information such as the area, building materials, and number of floors of rural houses to the floor plans. In terms of 3D model generation, traditional spatial and analytic geometry methods are only suitable for small indoor spaces or small objects and are not appropriate for the automated generation of large-scale outdoor 3D models of rural houses. Deep learning methods have high computational hardware requirements and lack sufficient outdoor point cloud datasets of rural houses. Existing software-based drawing methods are suitable for detailed modeling of single buildings but are not conducive to the rapid generation of large-scale simple 3D models. Therefore, we propose an algorithm for the rapid generation of outdoor 3D models based on the outdoor floor plans of rural houses. 
To meet the needs of 3D models for masking analysis and lighting analysis in rural settings, we extended the algorithm for drawing outdoor floor plans of rural houses. This algorithm uses the height of the walls in the point cloud data to longitudinally stretch the floor plans, quickly generating outdoor 3D models of rural houses.

2. Algorithms for Drawing Outdoor Floor Plans and 3D Models

2.1. Principle of the Outdoor Floor Plan Drawing Algorithm

This paper proposes an algorithm for generating outdoor floor plans based on the topological relationships of sliced and gridded wall point clouds, using point clouds of rural residential exterior walls collected by backpack LiDAR as the data source. This method is designed to meet the demand for periodically updating large-scale rural residential floor plans. The algorithm first segments the exterior wall point clouds by elevation and projects the segmented wall point clouds horizontally, followed by gridding. Subsequently, the outdoor floor plan is generated based on the topological relationships between wall points and the threshold angles of line segments. Finally, the algorithm calculates the area and the number of floors of the residence. Figure 3 illustrates the flowchart of this algorithm.
1. Preprocessing of outdoor point clouds for rural houses:
The outdoor point cloud data of rural residences collected by backpack LiDAR typically include ground, vehicles, and trees. To better generate outdoor floor plans of rural residences (Figure 4a,b), it is necessary to first classify various point clouds using existing point cloud processing software.
2. Clipping outdoor wall point clouds based on elevation:
When collecting point clouds of residential walls along rural roads using backpack LiDAR, the process is often obstructed by parked cars, planted fruit trees, and placed trash bins, leading to holes in the wall point clouds. Therefore, to create an outdoor floor plan of rural residences (Figure 4c), wall point cloud data within the height range of 1.5 to 1.7 m above the ground should be extracted.
3. Rasterized horizontal projection of the clipped wall point cloud:
Since the generated outdoor floor plan is independent of elevation, the segmented wall point clouds can be projected onto the horizontal plane. The number of grid cells in the X and Y directions for a specified grid cell size can be calculated using Equation (1), and each point can then be assigned to the corresponding grid cell using Equation (2). Finally, the centroid of all points within each grid cell can be calculated using Equation (3) to eliminate potentially redundant data (Figure 5).
$C_X = \left\lfloor \dfrac{X_{\max} - X_{\min}}{S_{\mathrm{grid}}} \right\rfloor + 1, \qquad C_Y = \left\lfloor \dfrac{Y_{\max} - Y_{\min}}{S_{\mathrm{grid}}} \right\rfloor + 1 \quad (1)$
In the formula: $X_{\max}$, $X_{\min}$, $Y_{\max}$, $Y_{\min}$ represent the maximum and minimum values of the X and Y coordinates in the point cloud; $S_{\mathrm{grid}}$ represents the size of each grid cell; $\lfloor\,\cdot\,\rfloor$ denotes the floor function; $C_X$ and $C_Y$ represent the number of grid cells in the X and Y directions.
$I_x = \left\lfloor \dfrac{X_i - X_{\min}}{S_{\mathrm{grid}}} \right\rfloor, \qquad I_y = \left\lfloor \dfrac{Y_i - Y_{\min}}{S_{\mathrm{grid}}} \right\rfloor \quad (2)$
In the formula: $X_i$ and $Y_i$ represent the coordinates of the $i$-th point; $I_x$ and $I_y$ represent the row and column indices of the grid cell containing the $i$-th point.
$\bar{X} = \dfrac{1}{n}\sum_{i=1}^{n} X_i, \qquad \bar{Y} = \dfrac{1}{n}\sum_{i=1}^{n} Y_i \quad (3)$
In the formula: $\bar{X}$, $\bar{Y}$ represent the centroid coordinates of all points in the current grid cell, and $n$ is the number of points in that cell.
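The rasterization step described by Equations (1)–(3) can be sketched as follows (a minimal Python illustration; the function name and the dictionary-based cell grouping are our own assumptions, not part of the published implementation):

```python
import numpy as np

def rasterize_points(points, s_grid):
    """Assign projected 2D wall points to grid cells (Eq. 2) and replace
    each occupied cell by the centroid of its points (Eq. 3)."""
    points = np.asarray(points, dtype=float)
    mins = points.min(axis=0)
    # Row/column index of the cell containing each point (Eq. 2).
    idx = np.floor((points - mins) / s_grid).astype(int)
    cells = {}
    for key, p in zip(map(tuple, idx), points):
        cells.setdefault(key, []).append(p)
    # The centroid of every occupied cell (Eq. 3) removes redundant points.
    return np.array([np.mean(ps, axis=0) for ps in cells.values()])
```

With the 0.20 m grid found optimal in Section 4.1, this would be called with `s_grid = 0.20`.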
4. Drawing the floor plan using the topological relationships of wall points:
Due to the lack of topological relationships among the points in the horizontally projected wall point clouds, the following method can be used to restore these relationships for floor plan generation. First, select any point P from the gridded wall point cloud, and perform a K-nearest neighbor search centered on P (sorted in ascending order of distance from P), recording the closest point Q. Next, perform another K-nearest neighbor search centered on Q (sorted in ascending order of distance from Q). If the closest point M to Q has already been recorded, find the nearest unrecorded point N among Q’s K-nearest neighbors. This process is repeated until point P is recorded again, thus restoring the topological relationships among the wall points (Figure 6).
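The nearest-neighbour walk in step 4 can be sketched as follows (a simplified Python illustration under the assumption that moving to the nearest unrecorded point at each step suffices; the K-nearest-neighbour bookkeeping of the published algorithm is condensed here):

```python
import numpy as np

def order_wall_points(points):
    """Restore an ordered outline from unordered gridded wall points by
    repeatedly moving to the nearest not-yet-recorded neighbour."""
    points = np.asarray(points, dtype=float)
    visited = [0]                      # start from an arbitrary point P
    remaining = set(range(1, len(points)))
    while remaining:
        cur = points[visited[-1]]
        # Nearest unrecorded neighbour of the current point.
        nxt = min(remaining, key=lambda j: np.linalg.norm(points[j] - cur))
        visited.append(nxt)
        remaining.remove(nxt)
    return visited   # connecting the last point back to P closes the loop
```

For large clouds, a spatial index (e.g. a k-d tree) would replace the linear nearest-neighbour search.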
5. Simplifying the floor plan using a line segment angle threshold:
The floor plan generated from the gridded wall points often contains numerous redundant vertices along straight-line segments. To simplify the floor plan, the following steps can be taken: starting from any vertex of the initial floor plan, check the angle between each pair of adjacent line segments; if the angle is close to 180°, remove the common vertex of the two segments and merge them into a new line segment. Proceeding clockwise or counterclockwise along the topological order of the initial plan's vertices, the simplification stops once the starting vertex is reached again, leaving only the true corner points between wall segments (Figure 7).
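Step 5 can be sketched as follows (a minimal Python illustration; the vertex ordering is assumed to come from step 4, and the function name is ours):

```python
import numpy as np

def simplify_polygon(verts, angle_thresh_deg=179.5):
    """Drop vertices whose adjacent segments are nearly collinear,
    i.e. whose interior angle reaches the threshold (close to 180 deg)."""
    verts = np.asarray(verts, dtype=float)
    n = len(verts)
    keep = []
    for i in range(n):
        a, b, c = verts[i - 1], verts[i], verts[(i + 1) % n]
        u, v = a - b, c - b
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if angle < angle_thresh_deg:   # a genuine corner: keep it
            keep.append(i)
    return verts[keep]
```

With the 179.5° threshold found optimal in Section 4.1, a vertex lying on a straight wall segment (angle ≈ 180°) is removed while true corners are retained.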
6. Calculating the area of the outdoor floor plan:
In mathematics, for any polygon in a Cartesian coordinate system, if the coordinates of each vertex $P_1(x_1, y_1), P_2(x_2, y_2), \ldots, P_n(x_n, y_n)$ are known, and the vertices are arranged in a clockwise or counter-clockwise order, the area of the polygon can be calculated using the Shoelace theorem (Equation (4)). By substituting the vertex coordinates of the simplified floor plan into Equation (4) in sequence, the area of the floor plan can be obtained.
$A = \dfrac{1}{2}\left|\sum_{i=1}^{n}\left(x_i y_{i+1} - x_{i+1} y_i\right)\right| \quad (4)$
In the formula: $|\,\cdot\,|$ denotes the absolute value; $x_{n+1}$ equals $x_1$; $y_{n+1}$ equals $y_1$.
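Equation (4) translates directly into code (a short Python sketch; the function name is ours):

```python
def shoelace_area(verts):
    """Area of a simple polygon from its ordered vertices (Eq. 4)."""
    n = len(verts)
    s = 0.0
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]   # wrap-around: P_{n+1} = P_1
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0
```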
7. Calculating the number of floors in the rural house:
From the pre-segmented rural residential wall point cloud, the maximum elevation $Z_{\max}$ and minimum elevation $Z_{\min}$ of the walls can be obtained. Given the specified height of each floor in the rural residence, the number of floors can be calculated using Equation (5).
$H_h = \left\lfloor \dfrac{Z_{\max} - Z_{\min}}{h} \right\rfloor \quad (5)$
In the formula: $H_h$ represents the number of floors in the rural house; $h$ denotes the specified height of each floor of the rural residence; $\lfloor\,\cdot\,\rfloor$ represents the floor function (rounding down).
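Equation (5) is a single floor division (sketch; the 3.0 m default storey height is an illustrative assumption, not a value from the paper):

```python
import math

def floor_count(z_max, z_min, storey_height=3.0):
    """Number of floors from the wall elevation range (Eq. 5)."""
    return math.floor((z_max - z_min) / storey_height)
```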
8. Adding attribute information to the outdoor floor plan:
In mathematics, for any polygon in a Cartesian coordinate system, if the coordinates of each vertex $P_1(x_1, y_1), P_2(x_2, y_2), \ldots, P_n(x_n, y_n)$ are known, and the vertices are arranged in a clockwise or counter-clockwise order, the centroid coordinates of the polygon can be calculated using Equation (6). By substituting the vertex coordinates of the simplified floor plan into Equation (6) in sequence, the centroid position of the floor plan can be determined. The area of the floor plan, along with information about the building material structure and number of floors, can be added at the centroid position (Figure 8). Material information is recorded during on-site collection and manually imported as attribute information before running the algorithm, after which it is automatically loaded into the floor plan.
$x = \dfrac{1}{6A}\sum_{i=1}^{n}\left(x_i + x_{i+1}\right)\left(x_i y_{i+1} - x_{i+1} y_i\right), \qquad y = \dfrac{1}{6A}\sum_{i=1}^{n}\left(y_i + y_{i+1}\right)\left(x_i y_{i+1} - x_{i+1} y_i\right) \quad (6)$
In the formula: $A$ represents the area of the polygon; $x_{n+1}$ equals $x_1$; $y_{n+1}$ equals $y_1$.
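Equation (6) can be sketched alongside the area computation (a Python illustration; the function name is ours):

```python
def polygon_centroid(verts):
    """Centroid of a simple polygon from its ordered vertices (Eq. 6);
    this point anchors the area, material, and floor-count labels."""
    n = len(verts)
    a = cx = cy = 0.0
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]   # wrap-around: P_{n+1} = P_1
        cross = x1 * y2 - x2 * y1
        a += cross
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    a /= 2.0                          # signed area (Eq. 4 without |.|)
    return cx / (6.0 * a), cy / (6.0 * a)
```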
9. Adding elevation annotations to the floor plan:
The ground points classified in step (1) of the point cloud preprocessing can be used to add several elevation annotation points around the outdoor floor plan (Figure 9). A point P is randomly selected from the outdoor ground points collected by the backpack LiDAR, and a radius search with P as the center and R as the radius is performed to find its neighboring points. Using the coordinates of point P and its neighbors, the planar position and elevation of the annotation point are computed with Equation (7).
$\bar{x} = \dfrac{1}{k}\sum_{i=1}^{k} x_i, \qquad \bar{y} = \dfrac{1}{k}\sum_{i=1}^{k} y_i, \qquad \bar{z} = \dfrac{1}{k}\sum_{i=1}^{k} z_i \quad (7)$
In the formula: $k$ represents the number of points found within radius $R$ of point $P$; $\bar{x}$, $\bar{y}$, $\bar{z}$ give the planar position and elevation of the annotation point.

2.2. Principle of the Outdoor 3D Model Drawing Algorithm

This paper proposes an algorithm for quickly generating a 3D model of rural residential exteriors based on wall point cloud heights and outdoor floor plans, using point clouds of rural residential exterior walls collected by backpack LiDAR as the data source. This algorithm is designed to meet the needs of spatial analyses such as illumination and inundation. The algorithm first uses the exterior wall point clouds collected by backpack LiDAR to draw a floor plan. Then, it creates control points for the 3D model based on the wall heights and the horizontal positions of vertices in the floor plan. Finally, the exterior 3D model’s walls and roof are drawn sequentially according to the topological relationships of the vertices in the floor plan. Figure 9 shows the workflow of the algorithm for drawing the 3D model of a rural residence.
1. Drawing the outdoor floor plan of rural houses:
Since the 3D model of the rural residence exterior is based on the outdoor floor plan, the specific principles will not be described in detail here; please refer to Section 2.1 for more information.
2. Creating control points for the 3D model of rural houses:
From the rural residential exterior wall point clouds, the maximum and minimum wall heights $h_{\max}$ and $h_{\min}$ can easily be obtained. Then, the planar X and Y coordinates of each vertex in the floor plan are taken, and the maximum and minimum wall heights are assigned as the elevations of these vertices, respectively, to create control points for the 3D model of the rural residence's exterior (Figure 10).
3. Drawing the walls of the outdoor 3D model of rural houses:
To draw the exterior 3D model’s walls according to the sequence of vertices in the floor plan, follow these steps. Step 1: Select any bottom point P as the starting point. Connect point P with its corresponding top point Q, then connect point Q with the adjacent bottom point M, and finally connect the bottom point M back to point P, completing the first triangular face. Step 2: Use point M as the starting point. Connect point M with its corresponding top point N, then connect point N with the adjacent top point Q, and finally connect the top point Q back to point M, completing the second triangular face. Repeat this process to complete all triangular faces (Figure 11).
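The wall triangulation in steps 1–2 can be sketched as follows (a Python illustration; the index layout, with bottom vertices first and top vertices offset by n, is our own convention):

```python
def extrude_walls(floor_verts, h_min, h_max):
    """Build the wall faces of the extruded model: each quad between
    consecutive floor-plan vertices is split into two triangles."""
    n = len(floor_verts)
    bottom = [(x, y, h_min) for x, y in floor_verts]
    top = [(x, y, h_max) for x, y in floor_verts]
    verts = bottom + top              # index i: bottom; index i + n: top
    tris = []
    for i in range(n):
        j = (i + 1) % n
        tris.append((i, i + n, j))        # P, its top point Q, bottom M
        tris.append((j, j + n, i + n))    # M, its top point N, top Q
    return verts, tris
```

Each floor-plan edge yields two triangles, so an n-vertex plan produces 2n wall faces.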
4. Drawing the roof of the outdoor 3D model of rural houses:
To draw the exterior 3D model’s roof according to the sequence of vertices in the floor plan, follow these steps. Step 1: Select any top point P as the starting point for all triangular faces on the roof. Connect point P with its adjacent top point Q, then connect point Q with its adjacent top point M, and finally connect the top point M back to point P, completing the first triangular face. Step 2: Connect point P with point M, then connect point M with its adjacent top point N, and finally connect the top point N back to point P, completing the second triangular face. Repeat this process to complete all triangular faces (Figure 12).
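The roof construction in steps 1–2 is a triangle fan anchored at one top vertex. A sketch (note that a fan is only guaranteed to be valid for convex outlines; a concave floor plan would require a general polygon triangulation):

```python
def roof_fan(top_indices):
    """Fan-triangulate the roof polygon: every triangle shares the
    first top vertex P, sweeping over consecutive vertex pairs."""
    p = top_indices[0]
    return [(p, top_indices[i], top_indices[i + 1])
            for i in range(1, len(top_indices) - 1)]
```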
5. Outputting the 3D model:
The three-dimensional model was exported in PLY file format. Figure 13 shows the outdoor 3D model of the rural residential area.

3. Data Collection

3.1. Use of Instruments

The backpack LiDAR (GEO-VISIOM, Beijing, China) used in this study is the SSW-QS lightweight indoor and outdoor measurement system with LiDAR SLAM technology (Figure 14). This instrument includes a panoramic camera, RTK, dual LiDAR, and a control system [32,33,34]. After LiDAR point cloud processing and resampling, the point spacing is 1–3 cm. The instrument's absolute accuracy is ≤10 cm, and its relative accuracy is ≤3 cm. The point cloud is georeferenced using RTK-SLAM technology, with the coordinate system being CGCS2000, 3° zone.
The parameters of the backpack LiDAR SLAM system are shown in Table 1.

3.2. Study Area

This study focuses on a village near the suburbs (Figure 15). Through field surveys and satellite imagery, it was found that the village covers an area of about 100,000 square meters. The houses are arranged in two rows: one facing north and the other facing south. The residential buildings are typical self-built houses characteristic of rural northern China.

3.3. Operational Procedure

Based on the distribution of the village houses, the data collection strategy involved segmenting the outdoor roads and collecting point cloud data. To ensure data completeness and accuracy, the collection route was designed to form closed loops whenever possible and to maintain symmetrical scanning in complex scenes. Starting from the center of the survey area, a figure-eight route was adopted for circular closed-loop scanning. Figure 16 shows the walking route during data collection, automatically generated by the backpack LiDAR. Figure 17 shows the outdoor point cloud data of rural houses collected using the backpack LiDAR.

4. Accuracy Analysis

4.1. Parameter Optimization of the Outdoor Floor Plan Drawing Algorithm

In the outdoor floor plan drawing algorithm for rural houses designed in Section 2.1, parameters such as the height range for wall point cloud extraction, the grid size for wall point cloud rasterization, and the angle threshold for floor plan simplification are considered. Meanwhile, the 3D model algorithm for rural houses outlined in Section 2.2, as an extension of the floor plan drawing algorithm, does not require separate optimization of key parameters. Subsequently, this study will optimize the parameters needed for the outdoor floor plan drawing algorithm to ensure the best possible results in terms of the appearance and accuracy of the generated plans.
1. Height range for clipping wall point clouds:
To minimize point cloud voids caused by obstructions like vehicles, garbage bins, and air conditioning units, we selected four different height ranges for extraction: 1.3–1.5 m, 1.5–1.7 m, 1.7–1.9 m, and 1.9–2.1 m (Figure 18). Within the 1.3–1.5 m range, the walls are partially obscured by objects such as vehicles. Similarly, in the 1.9–2.1 m range, walls are affected by obstructions like air conditioning units. In contrast, the ranges of 1.5–1.7 m and 1.7–1.9 m show no such interference. Additionally, the wall uniformity in the 1.5–1.7 m range is superior to that in the 1.7–1.9 m range.
2. Grid size for rasterizing wall point clouds:
To minimize redundant point clouds after horizontal projection, we selected four different grid sizes: 0.05 m, 0.10 m, 0.20 m, and 0.40 m (Figure 19, where black points represent the projected wall points, and red points indicate the rasterized wall points). At grid sizes of 0.05 m and 0.10 m, redundant points appear in some wall areas, which affects the subsequent operation of connecting the initial floor plan. At grid sizes of 0.20 m and 0.40 m, this issue does not occur; however, the precision of the wall at 0.40 m is slightly lower than at 0.20 m.
3. Angle threshold for simplifying floor plans:
To reduce the number of vertices in the floor plan, we selected three different angle thresholds: 179.5°, 179.0°, and 178.0° (Figure 20, where black points represent the projected wall points, blue lines represent the floor plan connections, and red points represent the vertices of the floor plan). Analysis of the figure shows that with angle thresholds of 179.5° and 179.0°, the simplified floor plan closely matches the walls with no significant deviation. However, at an angle threshold of 178.0°, slight deviations appear in some local areas. Combined with the statistical results in Table 2, an angle threshold of 179.5° yields the best simplification effect, significantly reducing the number of vertices while minimizing the area loss of the floor plan.

4.2. Accuracy Analysis of Outdoor Floor Plans and 3D Models

Using the optimal parameters, the outdoor floor plan of rural houses was generated quickly (Figure 21), along with a simplified 3D model (Figure 22).
The horizontal positional accuracy of the floor plan and the simplified 3D model was compared and analyzed using RTK measurements of the coordinates of the corners of rural house walls (Figure 19 and Figure 23). According to the comparative analysis results in Table 3, the maximum distance between the floor plan wall corners and the RTK-measured coordinates on the horizontal plane was 8.6 cm, the minimum distance was 1.5 cm, the average distance was 4.9 cm, and the standard error was 1.9 cm. The outlier threshold calculated from the average distance and the standard error was 0.106 m, and no observations exceeding this value were found in the detailed check in Table 3. According to the relevant standards, the floor plan generated from the outdoor point cloud of rural houses collected by the backpack-mounted LiDAR meets the planar accuracy requirements of a 1:1000 topographic map (Table 4).
Since the simplified 3D model is based on the floor plan, its horizontal positional accuracy is similar. As shown in Figure 24, the distances from the point cloud of the rural house exterior walls to the simplified 3D model are primarily concentrated between −0.1 m and 0.1 m. It can be concluded that the simplified 3D model created using the backpack-mounted LiDAR system in this study has high accuracy.

5. Conclusions

This paper investigates methods for the automated generation of outdoor floor plans and 3D models of rural houses using backpack LiDAR point clouds as the primary data source. The main conclusions are as follows:
  • Automated generation of outdoor floor plans: Based on the point cloud data collected by backpack LiDAR, this paper proposes an outdoor floor plan drawing algorithm based on the topological relationships of sliced and gridded wall point clouds. By comparison with the house corner coordinates measured by RTK, it is verified that the plans drawn by this algorithm meet the accuracy requirements of 1:1000 topographic maps.
  • Rapid automated generation of large-scale outdoor 3D models: To address the issue of quickly and automatically generating large-scale outdoor 3D models of rural houses, we proposed an algorithm for rapidly constructing 3D models based on outdoor floor plans. Measuring the distances from the wall point clouds to the 3D model and analyzing them statistically shows that the distances are concentrated within ±0.1 m.
  • Optimization of key parameters in floor plan generation: The algorithm's parameters, including the height range of the wall point clouds, the grid size for rasterizing the wall point clouds, and the angle threshold for simplifying the floor plan, have been optimized. This ensures that the generated outdoor floor plans and 3D models of rural houses achieve the highest possible accuracy.
  • The outdoor data of rural residences collected using backpack LiDAR in this study does not include roof data. For the roof, we plan to use drone LiDAR in the future to generate roof data, which will then be stitched together to make the 3D model more accurate. We also plan to use a total station to collect data in the test area during the next phase and compare it with the data collected using the backpack and RTK systems.

Author Contributions

Conceptualization, Q.Z. and L.C.; Methodology, Q.Z.; Software, Q.Z. and B.Z.; Validation, Q.Z. and B.Z.; Formal analysis, Q.Z.; Investigation, Q.Z.; Resources, L.C.; Data curation, Q.Z. and B.Z.; Writing—original draft preparation, Q.Z.; Writing—review and editing, Q.Z.; Visualization, Q.Z. and B.Z.; Supervision, L.C.; Funding acquisition, L.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China under grant number 41701597 and by the China Postdoctoral Science Foundation under grant number 2018M642746; both grants were awarded to L.C.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

Bingjie Zhang was employed by Shanghai Huace Navigation Technology Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figure 1. Method for extracting building wall point clouds. (a) Vertical segmentation of building point cloud; (b) histogram of indoor building elevation data.
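The wall-extraction step in Figure 1 slices the point cloud by elevation and inspects a histogram of z values. A minimal NumPy sketch of both operations follows; the function names and the default slicing band are illustrative, not taken from the paper:

```python
import numpy as np

def slice_wall_points(points, z_min=1.5, z_max=1.7):
    """Keep only points whose elevation falls inside the slicing band.

    points: (N, 3) array of x, y, z coordinates.
    Returns the subset of points with z_min <= z <= z_max.
    """
    z = points[:, 2]
    return points[(z >= z_min) & (z <= z_max)]

def elevation_histogram(points, bin_size=0.1):
    """Histogram of point elevations; peaks hint at floor/roof levels."""
    z = points[:, 2]
    bins = np.arange(z.min(), z.max() + bin_size, bin_size)
    return np.histogram(z, bins=bins)
```

The band (here 1.3~2.1 m in the experiments of Figure 18) is chosen so the slice avoids doors, windows, and roof overhangs.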
Figure 2. 3D model of building exterior.
Figure 3. Flowchart of the algorithm for drawing outdoor floor plans of rural houses.
Figure 4. Preprocessing and clipping of outdoor point clouds for rural houses. (a) Outdoor point cloud of rural houses; (b) point cloud of exterior walls of rural houses; (c) clipped point cloud of exterior walls of rural houses.
Figure 5. Diagram of rasterized horizontal projection of wall point cloud. (a) Point cloud after horizontal projection (wall corners); (b) grid division; (c) calculating the centroid for each grid cell.
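The rasterization step in Figure 5 projects the wall slice to the horizontal plane, divides it into square grid cells, and keeps one centroid per occupied cell. A minimal NumPy sketch, with illustrative names not from the paper:

```python
import numpy as np

def grid_centroids(points_xy, cell=0.1):
    """Rasterize projected wall points into square cells and return one
    centroid per occupied cell (the thinning step of Figure 5).

    points_xy: (N, 2) array of horizontally projected wall points.
    cell: grid size in metres (Figure 19 compares 0.05 to 0.40 m).
    """
    # Integer cell index for every point.
    ij = np.floor(points_xy / cell).astype(np.int64)
    buckets = {}
    for key, p in zip(map(tuple, ij), points_xy):
        buckets.setdefault(key, []).append(p)
    # Mean of the points in each occupied cell.
    return np.array([np.mean(v, axis=0) for v in buckets.values()])
```

A smaller cell preserves corner detail but leaves more vertices for the later simplification step.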
Figure 6. Diagram of drawing the floor plan using topological relationships of wall points. (a) Perform KNN search with point P as the center; (b) draw a line between point P and the nearest point Q; (c) perform KNN search with point Q as the center; (d) draw a line between point Q and the nearest unconnected point N; (e) continue this process to draw the initial floor plan.
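The chaining procedure of Figure 6 can be sketched as a greedy nearest-unvisited-neighbour walk: start from one grid centroid and repeatedly connect to the closest point not yet on the polyline. This brute-force sketch stands in for the paper's KNN search and uses illustrative names:

```python
import numpy as np

def trace_outline(points):
    """Order 2D points into a polyline by repeatedly jumping to the
    nearest point that has not been connected yet.

    points: (N, 2) array of grid-cell centroids.
    Returns the visiting order as a list of indices.
    """
    remaining = list(range(1, len(points)))
    order = [0]                                 # arbitrary start point
    while remaining:
        last = points[order[-1]]
        dists = [np.linalg.norm(points[i] - last) for i in remaining]
        nxt = remaining[int(np.argmin(dists))]  # nearest unconnected point
        order.append(nxt)
        remaining.remove(nxt)
    return order
```

In practice a KD-tree KNN query would replace the linear distance scan for large walls.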
Figure 7. Diagram of simplifying the floor plan using line segment angle threshold. (a) Adjacent line segments a and b have an angle close to 180°; (b) remove the shared point of adjacent line segments a and b; (c) continue this process to simplify the floor plan.
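The simplification of Figure 7 drops a vertex whenever the two segments meeting at it are nearly collinear, i.e., their angle exceeds a threshold such as the 178°~179.5° values tested in Figure 20. A minimal sketch with illustrative names:

```python
import numpy as np

def simplify(polyline, angle_threshold_deg=179.0):
    """Remove vertices where adjacent segments form an angle at or above
    the threshold (near-collinear), keeping genuine wall corners."""
    pts = [np.asarray(p, dtype=float) for p in polyline]
    out = [pts[0]]
    for prev, cur, nxt in zip(pts, pts[1:], pts[2:]):
        a, b = prev - cur, nxt - cur
        cos_ang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        ang = np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))
        if ang < angle_threshold_deg:   # a real corner: keep the vertex
            out.append(cur)
    out.append(pts[-1])
    return out
```

A higher threshold keeps more vertices; Table 2 shows the trade-off between vertex count and area change.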
Figure 8. Outdoor floor plan of a rural house.
Figure 9. Flowchart of the algorithm for drawing 3D models of rural houses.
Figure 10. Control points of the outdoor 3D model for rural houses.
Figure 11. Diagram of drawing the walls of the outdoor 3D model of rural houses. (a) Connecting the first triangle; (b) connecting the second triangle; (c) continue this process to draw the walls of the outdoor 3D model.
Figure 12. Diagram of drawing the roof of the outdoor 3D model of rural houses. (a) Connecting the first triangle; (b) connecting the second triangle; (c) continue this process to draw the roof of the outdoor 3D model.
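The wall construction of Figure 11 stretches each floor-plan edge vertically and splits the resulting rectangle into two triangles. A minimal sketch of that extrusion, with illustrative names and a flat extrusion height as a stand-in for the per-wall height taken from the point cloud:

```python
def wall_triangles(p0, p1, height):
    """Extrude one floor-plan edge (p0 -> p1) into a vertical wall made
    of two triangles, as in Figure 11.

    p0, p1: (x, y) floor-plan vertices.
    height: wall height derived from the wall point cloud.
    Returns two triangles, each a tuple of three (x, y, z) vertices.
    """
    x0, y0 = p0
    x1, y1 = p1
    a, b = (x0, y0, 0.0), (x1, y1, 0.0)        # bottom edge
    c, d = (x1, y1, height), (x0, y0, height)  # top edge
    return [(a, b, c), (a, c, d)]
```

Applying this to every edge of the simplified floor plan, then fan-triangulating the top polygon as in Figure 12, yields the closed outdoor model of Figure 13.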
Figure 13. Outdoor 3D model of rural houses.
Figure 14. Backpack-mounted LiDAR.
Figure 15. Field photo of the study village.
Figure 16. Data collection walking route.
Figure 17. Outdoor point cloud data of rural houses.
Figure 18. Walls at different height clipping ranges. (a) Wall clipping range of 1.3~1.5 m; (b) wall clipping range of 1.5~1.7 m; (c) wall clipping range of 1.7~1.9 m; (d) wall clipping range of 1.9~2.1 m.
Figure 19. Walls at different grid sizes. (a) Grid size of 0.05 m; (b) grid size of 0.10 m; (c) grid size of 0.20 m; (d) grid size of 0.40 m.
Figure 20. Floor plans at different angle thresholds. (a) Angle threshold of 179.5°; (b) angle threshold of 179.0°; (c) angle threshold of 178.0°; (d) initial floor plan.
Figure 21. Outdoor floor plans of rural houses.
Figure 22. Outdoor 3D models of rural houses.
Figure 23. Planar distance between RTK coordinates and corresponding floor plan positions.
Figure 24. Distance from outdoor wall point cloud of rural houses to outdoor 3D models. (a) Northwest direction; (b) northeast direction; (c) southwest direction; (d) southeast direction.
Table 1. Parameters of backpack laser SLAM system.

Category | Parameter | Value
System Parameters | Dimensions | 980 × 260 × 310 mm
System Parameters | Positioning Principle | Laser SLAM + RTK
System Parameters | LiDAR | 16 lines × 2
System Parameters | Camera | Panorama/Ladybug5+
System Parameters | Applicable Environment | Indoor and outdoor walkable scenarios
System Parameters | Operation Mode | Backpack/lightweight
System Parameters | Processor | Quad-core, eight-thread
System Parameters | Acquisition Speed | <20 km/h
Laser Parameters | LiDAR Accuracy | ±3 cm
Laser Parameters | Scanning Field of View | 360° × 360°
Laser Parameters | Measurement Range | 0.5~120 m
Laser Parameters | Scanning Frequency | 600,000 points per second
Data Output | Relative Accuracy | ≤3 cm
Data Output | Absolute Accuracy | ≤10 cm
Data Output | Point Cloud Format | Las, Pcd, Ply
Data Output | Point Cloud Density | 1~3 cm
Data Output | Point Cloud Thickness | <5 mm
Data Output | Panorama Density | 2~6 m
Table 2. Comparison of simplified floor plan and initial floor plan.

Type of Floor Plan | Angle Threshold | Number of Vertices | Area (m²) | Vertex Reduction (vs. Initial) | Area Reduction (vs. Initial)
Simplified Floor Plan | 179.5° | 934 | 4302.2914 | 55.33% | 0.001‰
Simplified Floor Plan | 179.0° | 523 | 4302.0247 | 74.99% | 0.064‰
Simplified Floor Plan | 178.0° | 326 | 4301.9962 | 84.41% | 0.071‰
Initial Floor Plan | None | 2091 | 4302.2980 | 0.00% | 0.000‰
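The reduction ratios in Table 2 follow directly from the vertex counts relative to the initial floor plan. A one-line check, with an illustrative function name:

```python
def vertex_reduction(simplified_vertices, initial_vertices):
    """Reduction ratio in percent relative to the initial floor plan,
    e.g. 523 vertices at a 179.0 deg threshold vs. 2091 initially."""
    return 100.0 * (1.0 - simplified_vertices / initial_vertices)
```

For instance, 523 of 2091 vertices remain at the 179.0° threshold, a 74.99% reduction, while the enclosed area changes by well under 0.1‰.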
Table 3. Accuracy analysis of RTK coordinates corresponding to floor plan positions.

ID | RTK North (m) | RTK East (m) | Plan North (m) | Plan East (m) | ΔNorth (m) | ΔEast (m) | Planar Distance (m) | PDOP | HRMS | VRMS
1 | 138.957 | −175.440 | 139.006 | −175.423 | 0.049 | 0.017 | 0.052 | 1.8 | 0.014 | 0.023
2 | 164.145 | −179.728 | 164.135 | −179.702 | −0.010 | 0.026 | 0.028 | 1.9 | 0.013 | 0.021
3 | 171.847 | −179.703 | 171.890 | −179.628 | 0.043 | 0.075 | 0.086 | 1.6 | 0.018 | 0.027
4 | 214.377 | −187.300 | 214.356 | −187.275 | −0.021 | 0.025 | 0.033 | 1.6 | 0.015 | 0.026
5 | 235.707 | −68.154 | 235.679 | −68.123 | −0.028 | 0.031 | 0.042 | 2.2 | 0.016 | 0.028
6 | 191.697 | −60.736 | 191.709 | −60.798 | 0.012 | −0.062 | 0.063 | 2.7 | 0.020 | 0.034
7 | 183.036 | −58.998 | 182.991 | −59.024 | −0.045 | −0.026 | 0.052 | 1.9 | 0.024 | 0.032
8 | 156.058 | −54.905 | 156.083 | −54.951 | 0.025 | −0.046 | 0.052 | 1.3 | 0.019 | 0.028
9 | 142.260 | −53.270 | 142.212 | −53.286 | −0.048 | −0.016 | 0.051 | 1.3 | 0.021 | 0.032
10 | 70.699 | −42.857 | 70.717 | −42.908 | 0.018 | −0.051 | 0.054 | 1.4 | 0.017 | 0.030
11 | 60.952 | −47.524 | 60.936 | −47.570 | −0.016 | −0.046 | 0.049 | 1.7 | 0.018 | 0.034
12 | 18.770 | −35.768 | 18.834 | −35.788 | 0.064 | −0.020 | 0.067 | 1.7 | 0.019 | 0.035
13 | −8.839 | −186.833 | −8.779 | −186.867 | 0.060 | −0.034 | 0.069 | 1.5 | 0.014 | 0.025
14 | 10.856 | −193.553 | 10.857 | −193.596 | 0.001 | −0.043 | 0.043 | 2.4 | 0.017 | 0.021
15 | 33.756 | −197.546 | 33.730 | −197.528 | −0.026 | 0.018 | 0.032 | 1.4 | 0.014 | 0.028
16 | 59.482 | −167.310 | 59.502 | −167.345 | 0.020 | −0.035 | 0.040 | 1.7 | 0.015 | 0.023
17 | 21.680 | −19.942 | 21.693 | −19.931 | 0.013 | 0.011 | 0.017 | 1.4 | 0.016 | 0.028
18 | 68.650 | −8.937 | 68.651 | −8.909 | 0.001 | 0.028 | 0.028 | 1.4 | 0.017 | 0.021
19 | 75.405 | −38.107 | 75.450 | −38.088 | 0.045 | 0.019 | 0.049 | 1.4 | 0.014 | 0.022
20 | 141.834 | −47.579 | 141.845 | −47.569 | 0.011 | 0.010 | 0.015 | 1.3 | 0.018 | 0.029
21 | 156.088 | −49.684 | 156.085 | −49.618 | −0.003 | 0.066 | 0.066 | 1.2 | 0.026 | 0.038
22 | 182.702 | −53.625 | 182.660 | −53.672 | −0.042 | −0.047 | 0.063 | 2 | 0.024 | 0.036
23 | 217.149 | −59.515 | 217.163 | −59.495 | 0.014 | 0.020 | 0.024 | 1.6 | 0.018 | 0.029
24 | 236.507 | −61.909 | 236.440 | −61.887 | −0.067 | 0.022 | 0.071 | 1.8 | 0.018 | 0.026
25 | 261.873 | 83.875 | 261.833 | 83.893 | −0.040 | 0.018 | 0.044 | 1.4 | 0.016 | 0.196
26 | 213.442 | 97.105 | 213.507 | 97.120 | 0.065 | 0.015 | 0.067 | 1.6 | 0.018 | 0.025
27 | 204.477 | 104.484 | 204.442 | 104.406 | −0.035 | −0.078 | 0.085 | 1.9 | 0.019 | 0.029
28 | 167.756 | 112.232 | 167.734 | 112.228 | −0.022 | −0.004 | 0.022 | 1.8 | 0.022 | 0.031
29 | 99.910 | 123.789 | 99.912 | 123.764 | 0.002 | −0.025 | 0.025 | 1.5 | 0.017 | 0.028
30 | 93.095 | 123.961 | 93.040 | 123.958 | −0.055 | −0.003 | 0.055 | 1.9 | 0.016 | 0.025
31 | 50.152 | 134.718 | 50.120 | 134.650 | −0.032 | −0.068 | 0.075 | 1.7 | 0.013 | 0.024
Average Planar Distance (m): 0.049 | Mean Error (m): 0.019
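The planar distances in Table 3 are the Euclidean norm of the north/east coordinate differences between each RTK-surveyed corner and its floor-plan counterpart. A one-line check:

```python
import math

def planar_distance(d_north, d_east):
    """Planar (horizontal) distance from north/east differences in metres."""
    return math.hypot(d_north, d_east)
```

For point 1, for example, ΔN = 0.049 m and ΔE = 0.017 m give a planar distance of about 0.052 m, and the 0.049 m average over all 31 points sits well inside the ±0.60 m tolerance for 1:1000 maps in Table 4.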
Table 4. Precision of plane position of ground object points.

Regional Distribution | Scale | Point Location Mean Error | Mean Error of Distance between Adjacent Feature Points
Urban, plain, hilly area, industrial building area | 1:500 | ±0.30 | ±0.20
Urban, plain, hilly area, industrial building area | 1:1000 | ±0.60 | ±0.40
Urban, plain, hilly area, industrial building area | 1:2000 | ±1.20 | ±0.80
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Zhu, Q.; Zhang, B.; Cai, L. An Algorithm for Generating Outdoor Floor Plans and 3D Models of Rural Houses Based on Backpack LiDAR. Sensors 2024, 24, 5723. https://doi.org/10.3390/s24175723


