Multi-Size Voxel Cube (MSVC) Algorithm—A Novel Method for Terrain Filtering from Dense Point Clouds Using a Deep Neural Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Method Principles
2.2. The Deep Neural Network and Its Training
2.3. Training/Testing Data
2.3.1. Data 1
2.3.2. Data 2
2.4. Testing and Evaluation Procedure
3. Results
3.1. Data 1—Rocks
3.2. Data 2
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
Algorithm | Data Type | Description | Reference |
---|---|---|---|
Slope based | ALS/sparse | For each point in the point cloud, the local ground slope is calculated, and points whose slope exceeds the selected threshold are considered non-ground points. It is assumed that such above-threshold points lie on buildings or trees, or are erroneous points (a minimal code sketch of this principle is given after this table). | [13] |
Slope adaptive | ALS/sparse | This is an improvement of the previous method, where the slope threshold is not fixed but adaptively adjusted based on the wider neighborhood. | [14] |
Multi-Directional | ALS/sparse | This method analyzes the gradients between a point and its neighbors in multiple directions, and it is based on the assumption that the neighborhood of ground points is smoother than that of non-ground points. | [15] |
Adaptive Slope | ALS/sparse | This works similarly to the previous one, except that the slope threshold is adaptively adjusted based on local statistics. | [16] |
Triangular Grid Filter | ALS/sparse | A triangular grid filter based on the slope filter; it identifies points that violate the spatial position relationships within the triangulation network using improved KD-tree-based Euclidean clustering. | [17] |
Modified 3D Alpha Shape | ALS/sparse | Preprocessing for outlier removal and potential ground point extraction; the deployment of a modified 3D alpha shape to construct multiscale point cloud layers; and the use of a multiscale triangulated irregular network (TIN) densification process for precise ground point extraction. | [18] |
Weighted mean | ALS/sparse | A rough terrain model is first estimated, which may also include non-ground points. The terrain points are then iteratively determined by a weighted mean of the surrounding points, where the weight decreases with distance from the predicted terrain. | [19] |
TIN densification | ALS/sparse | This method uses a triangular irregular network (TIN) as a representation of the terrain. The algorithm first selects a basic approximation of the terrain defined by a small number of points (usually with the lowest elevations in different parts of the data space) and creates a TIN from these points. New points are added iteratively and the TIN is condensed. Points are added if they fit the expected terrain characteristics (e.g., slopes are not too steep). The thresholds for point selection are adaptively adjusted during the iteration to allow the method to handle variable topography. | [20] |
Repetitive Interpolation | ALS/sparse | The method is based on repeated interpolation between points, where points that do not meet the criteria are gradually eliminated. The iterative interpolation produces an approximated terrain, and points are eliminated based on their distance from this approximation. | [21] |
Adaptive TIN densification | ALS/sparse | Improves the TIN densification method [20] by selecting the starting points in an adaptive grid. | [22] |
Progressive Morphological Filter | ALS/sparse | The progressive morphological filter gradually increases the size of the filter window and uses elevation difference thresholds to remove the points of unwanted objects (buildings, vegetation, etc.) while the terrain data is preserved. | [23] |
Simple Morphological Filter | ALS/sparse | The simple morphological filter (SMRF) solves the terrain classification problem using image processing techniques. This filter uses a linearly increasing window and simple slope thresholding, along with a novel application of image completion techniques. | [24] |
Morphological Multi-Gradient | ALS/sparse | A morphological filter that is based on the analysis of multiple gradients for each point. | [25] |
Object-Based Land Cover Classification | ALS/sparse | This method focuses on the relative information content from the height, intensity, and shape of features found in the scene. Eight object metrics were used to classify the terrain into land cover information: average height, the standard deviation of height, height homogeneity, height contrast, height entropy, height correlation, average intensity, and compactness. A machine learning decision tree was used. | [26] |
Contextual Segment-Based | ALS/sparse | This method is based on a conditional random field (CRF), which is a graphical model that can be used to model the relationship between different variables. In this case, the variables are the labels of the different segments in the point cloud. The CRF is trained on a dataset of labeled point clouds, and then it can be used to classify new point clouds. | [27] |
Skewness and Kurtosis | ALS/sparse | Iterative analysis of the skewness and kurtosis of the statistical distribution around a point. | [28] |
CSF | ALS/sparse | Based on the idea of simulating a cloth that is dropped onto the inverted point cloud. The cloth will naturally drape over the surface of the point cloud, and the points that are covered by the cloth can be classified as ground points. There are only three parameters defining the behavior of the cloth—the distance of the points defining the deformation of the cloth (square grid), the pliability of the cloth (in the form of three scenarios—steep slope, relief, and flat), and the possibility of slope processing—i.e., whether the terrain can climb very steeply. Since the number (density) of points defining the spatial area of the cloth is small relative to the original point cloud, usually all points of the point cloud closer to the cloth than a defined constant (threshold) are considered terrain points. | [29] |
Complementary Cloth Simulation and Progressive TIN Densification | ALS/sparse | This combines the methods of cloth simulation and progressive TIN densification. This hybrid approach exploits the strengths of both methods: the accuracy of cloth simulation in detecting terrain in challenging environments and the robustness of progressive TIN densification in removing non-terrain points. | [30] |
MDSR | Dense | This method is based on the idea of rasterizing the point cloud from multiple directions and with multiple shifts of the raster grid to identify ground points. | [31] |
Combined Structural and geometrical filtering | Dense | This method combines structural filtering using the CANUPO tool with a geometric filter (preferably CSF) applied to the data in a horizontal position (the rock wall is transformed so that the fitted plane is horizontal). | [32] |
RGB filtering | Dense | The method performs point cloud filtering based on non-geometric point parameters—color (RGB). This is done using both a neural network and an approximation of the color spaces of each class by an originally designed automatic method based on 3D Gaussian mixed models. The method has a wider scope and can also be used for general color-based classification. | [33] |
Spatial and spectral characteristics | Dense | The method uses a neural network for the ground filtering of point clouds of coastal salt marshes (essentially flat terrain with ground vegetation); the input data are not coordinates but spatial (e.g., height) and spectral characteristics (e.g., eigenvalues calculated from the spherical neighborhood). The test data were acquired using a UAV lidar system. | [34] |
Spatial and spectral characteristics | ALS/sparse | Spatial and spectral features are used for machine learning; testing was performed on flat terrain data with buildings and tall vegetation acquired by ALS. | [35] |
Local characteristics—edge convolution | ALS/sparse | A method using spatial and local characteristics (e.g., roughness) in addition to the reduced position of the point. A special feature is the use of the edge convolution operation. | [1] |
Voxelization and 3D convolution | ALS/sparse | The voxelization and post-processing are performed using 3D convolutions and max-pooling; the calculation is demanding and was again tested on ALS data, where the maximum number of points in the cloud is 1 million, with a density of 5–10 points per square meter. The terrain is flat. | [36] |
Transformation to image-like structure | ALS/sparse | This uses the transformation of the point’s surroundings into an image form and its further processing by a convolutional neural network, using specially designed local characteristics instead of RGB. Again tested on ALS data (8 points/m², 700 k points) of flat terrain with buildings. | [37] |
Point clouds projected to multidimensional image | ALS/sparse | The point clouds are first projected onto a horizontal plane and converted into a multidimensional image, using pixel sizes of 0.5 m and 1 m. This is then analyzed using a multiscale fully convolutional network. Flat terrain. | [38] |
CNN under slope and copula correlation constraint | ALS/sparse | Farthest point sampling with slope constraints, intra-class feature enhancements via copula correlation and attention mechanisms, filter error correction using copula correlation and confidence intervals, and the refinement of filtering accuracy by adjusting for negatively correlated point sets. | [39] |
Multi-Scale and Multi-View Deep Features | ALS/sparse | Elevation features, spectral features, and geometric features are used. The cloud is voxelized, and feature maps generated by projections onto three orthogonal planes are used for classification. The classification is performed using a fully convolutional network (FCN). Test dataset—point cloud showing flat terrain obtained by ALS (with all features as usual). | [40] |
Iterative sequential terrain prediction | ALS/sparse | This converts the terrain filtering problem into an iterative sequential prediction problem using point profiles. Uses deep reinforcement learning (DRL): DRL optimizes the prediction sequence and sequentially acquires the bare terrain. | [41] |
2D projection of 3D features | ALS/sparse | This uses point cloud transformation into voxel representation and the 2D projection of 3D features; classification is done by convolutional neural network. | [42] |
Multi-Scale CNN with Attention Mechanism | ALS/sparse | This classifies the point cloud by transforming it into a 2D image, where the transformed height is used instead of colors. The classification itself is performed using a convolutional neural network (CNN). | [43] |
Vertical Slice Equal Sampling | ALS/sparse | Locally samples the original point cloud, organizing the unordered sequence of points and reducing their number while maintaining the terrain’s representation; classification is then performed using a specially designed transformer network. | [44] |
PointNet | All | Neural network architecture designed for the direct processing of point clouds without the need to convert them into regular 3D voxel grids or collections of images. PointNet respects the permutation invariance of points in a point cloud and provides a unified architecture for applications such as object classification. | [45] |
PointNet++ | All | An enhanced version of PointNet. PointNet++ focuses on learning hierarchical features on point sets in a metric space. It addresses the limitations of the original PointNet by capturing local structures induced by the metric space, improving the ability to recognize fine patterns and generalize complex scenes. | [46] |
Point Cloud Binary Voxelization | ALS/Sparse | Point clouds are converted into a binary voxel-based data (BVD) model, where each voxel has a value of 1 or 0 depending on whether it contains LiDAR points. The algorithm selects the lowest voxels with a value of 1 as ground seeds and then labels them and their 3D-connected set as ground voxels. | [47] |
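As a concrete illustration of the simplest family in the table above (the slope-based filtering of [13]), a minimal Python sketch follows; the function name, the neighborhood radius, and the slope threshold are illustrative assumptions rather than parameters taken from the cited work.

# Minimal sketch of slope-based ground filtering in the spirit of [13].
# A point is labeled non-ground if it lies above some neighbor with a slope
# steeper than slope_max; radius and slope_max are illustrative values only.
import numpy as np
from scipy.spatial import cKDTree

def slope_based_filter(points, radius=1.0, slope_max=0.5):
    """points: (N, 3) array of x, y, z coordinates; returns a boolean ground mask."""
    tree = cKDTree(points[:, :2])                  # neighbor search in the x-y plane
    ground = np.ones(len(points), dtype=bool)
    for i, (x, y, z) in enumerate(points):
        for j in tree.query_ball_point((x, y), r=radius):
            if j == i:
                continue
            dz = z - points[j, 2]                  # height above the neighbor
            dxy = np.hypot(x - points[j, 0], y - points[j, 1])
            if dz > 0 and dxy > 0 and dz / dxy > slope_max:
                ground[i] = False                  # too steep above a lower point
                break
    return ground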
Appendix B
Appendix C. Definition of a Neural Network in Python Using the TensorFlow Library
import numpy as np
import tensorflow as tf
import keras
from keras import regularizers

kernel = 9                         # cube edge (number of voxels)
k3 = int(np.power(kernel, 3))      # number of input voxels (kernel**3)
n1 = int(2 * k3)                   # width of the first hidden layer
add_layers_number = 7              # number of hidden layers
regu = 0.001                       # coefficient of L2 regularization
drop = 0.25                        # drop rate 25%

def CreateNetModel_T2(add_layers_number, n1, k3, regu, drop):
    model = keras.Sequential()
    model.add(keras.Input(shape=(k3,)))
    for i in range(add_layers_number):
        model.add(keras.layers.Dense(int(n1 / np.power(2, i)),
                                     activation='relu',
                                     kernel_regularizer=regularizers.L2(regu)))
        model.add(keras.layers.Dropout(drop))
    model.add(keras.layers.Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam',
                  metrics=['accuracy'])
    return model
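The following is a minimal usage sketch of the model defined above; the dummy occupancy arrays, the training hyperparameters (epochs, batch size, validation split), and the 0.5 decision threshold are illustrative assumptions, not values prescribed by the paper.

# Illustrative training and prediction call, continuing from the definitions above.
# The random arrays below only stand in for real voxel-cube samples and labels.
rng = np.random.default_rng(0)
x_train = rng.integers(0, 2, size=(1000, k3)).astype('float32')  # dummy occupancy vectors
y_train = rng.integers(0, 2, size=(1000,)).astype('float32')     # dummy labels (1 = ground, assumed)

model = CreateNetModel_T2(add_layers_number, n1, k3, regu, drop)
model.fit(x_train, y_train, epochs=10, batch_size=256, validation_split=0.2)

x_new = rng.integers(0, 2, size=(10, k3)).astype('float32')
is_ground = model.predict(x_new)[:, 0] > 0.5   # 0.5 is an assumed decision threshold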
Appendix D. Complete Classification Results for Data 2
Boulders area:
Method | Cloth Resolution/Voxel Size [m] | Threshold [m] | TPR [%] | TNR [%] | BA [%] | FS [%] |
---|---|---|---|---|---|---|
CSF | 0.025 | 0.250 | 99.35 | 99.57 | 99.46 | 99.32 |
CSF | 0.050 | 0.250 | 99.00 | 99.67 | 99.34 | 99.23 |
CSF | 0.100 | 0.250 | 97.97 | 99.75 | 98.86 | 98.77 |
CSF | 0.250 | 0.250 | 90.23 | 99.81 | 95.02 | 94.71 |
CSF | 0.025 | 0.200 | 98.85 | 99.64 | 99.24 | 99.13 |
CSF | 0.050 | 0.200 | 98.13 | 99.77 | 98.95 | 98.87 |
CSF | 0.100 | 0.200 | 96.66 | 99.85 | 98.26 | 98.18 |
CSF | 0.250 | 0.200 | 86.90 | 99.88 | 93.39 | 92.90 |
CSF | 0.025 | 0.150 | 97.45 | 99.71 | 98.58 | 98.47 |
CSF | 0.050 | 0.150 | 95.71 | 99.85 | 97.78 | 97.68 |
CSF | 0.100 | 0.150 | 93.25 | 99.92 | 96.58 | 96.44 |
CSF | 0.250 | 0.150 | 80.93 | 99.93 | 90.43 | 89.41 |
MSVC | 0.110 | - | 99.30 | 99.79 | 99.54 | 99.47 |
MSVC | 0.140 | - | 99.67 | 99.69 | 99.68 | 99.58 |
MSVC | 0.190 | - | 99.88 | 99.57 | 99.72 | 99.59 |
MSVC | 0.250 | - | 100.00 | 99.16 | 99.58 | 99.32 |
Tower area:
Method | Cloth Resolution/Voxel Size [m] | Threshold [m] | TPR [%] | TNR [%] | BA [%] | FS [%] |
---|---|---|---|---|---|---|
CSF | 0.025 | 0.250 | 99.52 | 97.11 | 98.32 | 96.97 |
CSF | 0.050 | 0.250 | 99.29 | 97.41 | 98.35 | 97.14 |
CSF | 0.100 | 0.250 | 98.61 | 97.56 | 98.08 | 96.93 |
CSF | 0.250 | 0.250 | 94.33 | 98.07 | 96.20 | 95.20 |
CSF | 0.025 | 0.200 | 99.33 | 97.37 | 98.35 | 97.12 |
CSF | 0.050 | 0.200 | 99.00 | 97.79 | 98.39 | 97.35 |
CSF | 0.100 | 0.200 | 98.09 | 97.97 | 98.03 | 97.06 |
CSF | 0.250 | 0.200 | 92.51 | 98.46 | 95.49 | 94.61 |
CSF | 0.025 | 0.150 | 98.60 | 97.92 | 98.26 | 97.27 |
CSF | 0.050 | 0.150 | 97.93 | 98.57 | 98.25 | 97.56 |
CSF | 0.100 | 0.150 | 96.41 | 98.78 | 97.59 | 96.98 |
CSF | 0.250 | 0.150 | 88.61 | 99.12 | 93.86 | 93.10 |
MSVC | 0.110 | - | 99.78 | 98.00 | 98.89 | 97.95 |
MSVC | 0.140 | - | 99.79 | 97.75 | 98.77 | 97.71 |
MSVC | 0.190 | - | 99.80 | 97.57 | 98.69 | 97.55 |
MSVC | 0.250 | - | 99.81 | 97.43 | 98.62 | 97.42 |
Rugged area:
Method | Cloth Resolution/Voxel Size [m] | Threshold [m] | TPR [%] | TNR [%] | BA [%] | FS [%] |
---|---|---|---|---|---|---|
CSF | 0.025 | 0.250 | 99.19 | 97.56 | 98.37 | 97.78 |
CSF | 0.050 | 0.250 | 98.51 | 98.16 | 98.33 | 97.88 |
CSF | 0.100 | 0.250 | 98.17 | 98.60 | 98.38 | 98.03 |
CSF | 0.250 | 0.250 | 93.77 | 98.96 | 96.37 | 96.01 |
CSF | 0.025 | 0.200 | 99.33 | 98.52 | 97.85 | 98.18 |
CSF | 0.050 | 0.200 | 99.00 | 97.33 | 98.56 | 97.95 |
CSF | 0.100 | 0.200 | 98.09 | 96.38 | 99.03 | 97.70 |
CSF | 0.250 | 0.200 | 92.51 | 90.66 | 99.31 | 94.98 |
CSF | 0.025 | 0.150 | 96.76 | 98.18 | 97.47 | 97.00 |
CSF | 0.050 | 0.150 | 94.23 | 98.93 | 96.58 | 96.23 |
CSF | 0.100 | 0.150 | 91.96 | 99.37 | 95.66 | 95.34 |
CSF | 0.250 | 0.150 | 84.48 | 99.58 | 92.03 | 91.27 |
MSVC | 0.110 | - | 99.20 | 98.82 | 99.01 | 98.71 |
MSVC | 0.140 | - | 99.64 | 98.46 | 99.05 | 98.67 |
MSVC | 0.190 | - | 99.81 | 98.06 | 98.94 | 98.47 |
MSVC | 0.250 | - | 99.87 | 97.66 | 98.76 | 98.19 |
References
- Ciou, T.-S.; Lin, C.-H.; Wang, C.-K. Airborne LiDAR Point Cloud Classification Using Ensemble Learning for DEM Generation. Sensors 2024, 24, 6858. [Google Scholar] [CrossRef] [PubMed]
- Wegner, K.; Durand, V.; Villeneuve, N.; Mangeney, A.; Kowalski, P.; Peltier, A.; Stark, M.; Becht, M.; Haas, F. Multitemporal Quantification of the Geomorphodynamics on a Slope within the Cratère Dolomieu—At the Piton de La Fournaise (La Réunion, Indian Ocean) Using Terrestrial LiDAR Data, Terrestrial Photographs, and Webcam Data. Geosciences 2024, 14, 259. [Google Scholar] [CrossRef]
- Peralta, T.; Menoscal, M.; Bravo, G.; Rosado, V.; Vaca, V.; Capa, D.; Mulas, M.; Jordá-Bordehore, L. Rock Slope Stability Analysis Using Terrestrial Photogrammetry and Virtual Reality on Ignimbritic Deposits. J. Imaging 2024, 10, 106. [Google Scholar] [CrossRef]
- Treccani, D.; Adami, A.; Brunelli, V.; Fregonese, L. Mobile Mapping System for Historic Built Heritage and GIS Integration: A Challenging Case Study. Appl. Geomat. 2024, 16, 293–312. [Google Scholar] [CrossRef]
- Marčiš, M.; Fraštia, M.; Lieskovský, T.; Ambroz, M.; Mikula, K. Photogrammetric Measurement of Grassland Fire Spread: Techniques and Challenges with Low-Cost Unmanned Aerial Vehicles. Drones 2024, 8, 282. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Křemen, T.; Braun, J. UAV DTM Acquisition in a Forested Area—Comparison of Low-Cost Photogrammetry (DJI Zenmuse P1) and LiDAR Solutions (DJI Zenmuse L1). Eur. J. Remote Sens. 2023, 56, 2179942. [Google Scholar] [CrossRef]
- Marotta, F.; Teruggi, S.; Achille, C.; Vassena, G.P.M.; Fassi, F. Integrated Laser Scanner Techniques to Produce High-Resolution DTM of Vegetated Territory. Remote Sens. 2021, 13, 2504. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Křemen, T.; Braun, J.; Michal, O.; Jiřikovský, T. Scanning the Underground: Comparison of the Accuracies of SLAM and Static Laser Scanners in a Mine Tunnel. Measurement 2024, 242, 115875. [Google Scholar] [CrossRef]
- Pavelka, K., Jr.; Běloch, L.; Pavelka, K. Modern Methods of Documentation and Visualization of Historical Mines in the Unesco Mining Region in the Ore Mountains. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, X-M-1-2023, 237–244. [Google Scholar] [CrossRef]
- Meng, X.; Currit, N.; Zhao, K. Ground Filtering Algorithms for Airborne LiDAR Data: A Review of Critical Issues. Remote Sens. 2010, 2, 833–860. [Google Scholar] [CrossRef]
- Qin, N.; Tan, W.; Guan, H.; Wang, L.; Ma, L.; Tao, P.; Fatholahi, S.; Hu, X.; Li, J. Towards Intelligent Ground Filtering of Large-Scale Topographic Point Clouds: A Comprehensive Survey. Int. J. Appl. Earth Obs. Geoinf. 2023, 125, 103566. [Google Scholar] [CrossRef]
- Chen, C.; Guo, J.; Wu, H.; Li, Y.; Shi, B. Performance Comparison of Filtering Algorithms for High-Density Airborne LiDAR Point Clouds over Complex Landscapes. Remote Sens. 2021, 13, 2663. [Google Scholar] [CrossRef]
- Vosselman, G. Slope based filtering of laser altimetry data. Int. Arch. Photogramm. Remote Sens. 2000, 33, 935–942. [Google Scholar]
- Sithole, G. Filtering of laser altimetry data using a slope adaptive filter. Int. Arch. Photogramm. Remote Sens. 2001, 34, 203–210. [Google Scholar]
- Meng, X.; Wang, L.; Silván-Cárdenas, J.L.; Currit, N. A Multi-Directional Ground Filtering Algorithm for Airborne LIDAR. ISPRS J. Photogramm. Remote Sens. 2008, 64, 117–124. [Google Scholar] [CrossRef]
- Susaki, J. Adaptive Slope Filtering of Airborne LiDAR Data in Urban Areas for Digital Terrain Model (DTM) Generation. Remote Sens. 2012, 4, 1804–1819. [Google Scholar] [CrossRef]
- Kang, C.; Lin, Z.; Wu, S.; Lan, Y.; Geng, C.; Zhang, S. A Triangular Grid Filter Method Based on the Slope Filter. Remote Sens. 2023, 15, 2930. [Google Scholar] [CrossRef]
- Cao, D.; Wang, C.; Du, M.; Xi, X. A Multiscale Filtering Method for Airborne LiDAR Data Using Modified 3D Alpha Shape. Remote Sens. 2024, 16, 1443. [Google Scholar] [CrossRef]
- Kraus, K.; Pfeifer, N. Determination of Terrain Models in Wooded Areas with Airborne Laser Scanner Data. ISPRS J. Photogramm. Remote Sens. 1998, 53, 193–203. [Google Scholar] [CrossRef]
- Axelsson, P. DEM generation from laser scanner data using adaptive TIN models. Int. Arch. Photogramm. Remote Sens. 2000, 33, 111–118. [Google Scholar]
- Kobler, A.; Pfeifer, N.; Ogrinc, P.; Todorovski, L.; Oštir, K.; Džeroski, S. Repetitive Interpolation: A Robust Algorithm for DTM Generation from Aerial Laser Scanner Data in Forested Terrain. Remote Sens. Environ. 2006, 108, 9–23. [Google Scholar] [CrossRef]
- Zheng, J.; Xiang, M.; Zhang, T.; Zhou, J. An Improved Adaptive Grid-Based Progressive Triangulated Irregular Network Densification Algorithm for Filtering Airborne LiDAR Data. Remote Sens. 2024, 16, 3846. [Google Scholar] [CrossRef]
- Zhang, K.; Chen, S.-C.; Whitman, D.; Shyu, M.-L.; Yan, J.; Zhang, C. A Progressive Morphological Filter for Removing Nonground Measurements from Airborne LIDAR Data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 872–882. [Google Scholar] [CrossRef]
- Pingel, T.J.; Clarke, K.C.; McBride, W.A. An Improved Simple Morphological Filter for the Terrain Classification of Airborne LIDAR Data. ISPRS J. Photogramm. Remote Sens. 2013, 77, 21–30. [Google Scholar] [CrossRef]
- Li, Y. Filtering Airborne Lidar Data by an Improved Morphological Method Based on Multi-Gradient Analysis. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W1, 191–194. [Google Scholar] [CrossRef]
- Im, J.; Jensen, J.R.; Hodgson, M.E. Object-Based Land Cover Classification Using High-Posting-Density LiDAR Data. GIScience Remote Sens. 2008, 45, 209–228. [Google Scholar] [CrossRef]
- Vosselman, G.; Coenen, M.; Rottensteiner, F. Contextual Segment-Based Classification of Airborne Laser Scanner Data. ISPRS J. Photogramm. Remote Sens. 2017, 128, 354–371. [Google Scholar] [CrossRef]
- Crosilla, F.; Macorig, D.; Scaioni, M.; Sebastianutti, I.; Visintini, D. LiDAR Data Filtering and Classification by Skewness and Kurtosis Iterative Analysis of Multiple Point Cloud Data Categories. Appl. Geomat. 2013, 5, 225–240. [Google Scholar] [CrossRef]
- Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Xie, D.; Wang, X.; Yan, G. An Easy-to-Use Airborne LiDAR Data Filtering Method Based on Cloth Simulation. Remote Sens. 2016, 8, 501. [Google Scholar] [CrossRef]
- Cai, S.; Zhang, W.; Liang, X.; Wan, P.; Qi, J.; Yu, S.; Yan, G.; Shao, J. Filtering Airborne LiDAR Data Through Complementary Cloth Simulation and Progressive TIN Densification Filters. Remote Sens. 2019, 11, 1037. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Línková, L. Multidirectional Shift Rasterization (MDSR) Algorithm for Effective Identification of Ground in Dense Point Clouds. Remote Sens. 2022, 14, 4916. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Lidmila, M.; Kolář, V.; Křemen, T. Vegetation Filtering of a Steep Rugged Terrain: The Performance of Standard Algorithms and a Newly Proposed Workflow on an Example of a Railway Ledge. Remote Sens. 2021, 13, 3050. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Línková, L. Color-Based Point Cloud Classification Using a Novel Gaussian Mixed Modeling-Based Approach versus a Deep Neural Network. Remote Sens. 2024, 16, 115. [Google Scholar] [CrossRef]
- Liu, K.; Liu, S.; Tan, K.; Yin, M.; Tao, P. ANN-Based Filtering of Drone LiDAR in Coastal Salt Marshes Using Spatial–Spectral Features. Remote Sens. 2024, 16, 3373. [Google Scholar] [CrossRef]
- Nurunnabi, A.; Teferle, F.N.; Li, J.; Lindenbergh, R.C.; Hunegnaw, A. An Efficient Deep Learning Approach for Ground Point Filtering in Aerial Laser Scanning Point Clouds. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, XLIII-B1-2021, 31–38. [Google Scholar] [CrossRef]
- Zhang, Z.; Sun, L.; Zhong, R.; Chen, D.; Zhang, L.; Li, X.; Wang, Q.; Chen, S. Hierarchical Aggregated Deep Features for ALS Point Cloud Classification. IEEE Trans. Geosci. Remote Sens. 2020, 59, 1686–1699. [Google Scholar] [CrossRef]
- Chen, R.; Wu, J.; Zhao, X.; Luo, Y.; Xu, G. SC-CNN: LiDAR Point Cloud Filtering CNN under Slope and Copula Correlation Constraint. ISPRS J. Photogramm. Remote Sens. 2024, 212, 381–395. [Google Scholar] [CrossRef]
- Yang, Z.; Jiang, W.; Xu, B.; Zhu, Q.; Jiang, S.; Huang, W. A Convolutional Neural Network-Based 3D Semantic Labeling Method for ALS Point Clouds. Remote Sens. 2017, 9, 936. [Google Scholar] [CrossRef]
- Rizaldy, A.; Persello, C.; Gevaert, C.; Elberink, S.O.; Vosselman, G. Ground and Multi-Class Classification of Airborne Laser Scanner Point Clouds Using Fully Convolutional Networks. Remote Sens. 2018, 10, 1723. [Google Scholar] [CrossRef]
- Lei, X.; Wang, H.; Wang, C.; Zhao, Z.; Miao, J.; Tian, P. ALS Point Cloud Classification by Integrating an Improved Fully Convolutional Network into Transfer Learning with Multi-Scale and Multi-View Deep Features. Sensors 2020, 20, 6969. [Google Scholar] [CrossRef]
- Dai, H.; Hu, X.; Shu, Z.; Qin, N.; Zhang, J. Deep Ground Filtering of Large-Scale ALS Point Clouds via Iterative Sequential Ground Prediction. Remote Sens. 2023, 15, 961. [Google Scholar] [CrossRef]
- Dai, H.; Hu, X.; Zhang, J.; Shu, Z.; Xu, J.; Du, J. Large-Scale ALS Point Clouds Segmentation via Projection-Based Context Embedding. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–16. [Google Scholar] [CrossRef]
- Wang, B.; Wang, H.; Song, D. A Filtering Method for LiDAR Point Cloud Based on Multi-Scale CNN with Attention Mechanism. Remote Sens. 2022, 14, 6170. [Google Scholar] [CrossRef]
- Wen, W.; Yang, R.; Tan, J.; Liu, H.; Tan, J. Vertical Slice Equal Sampling and Transformer Network for Point Cloud Ground Filtering. Int. J. Remote Sens. 2024, 45, 4710–4736. [Google Scholar] [CrossRef]
- Qi, C.R.; Su, H.; Mo, K.; Guibas, L.J. PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation. In Proceedings of the Fourth International Conference on 3D Vision, Stanford, CA, USA, 25–28 October 2016; pp. 601–610. [Google Scholar]
- Qi, C.R.; Yi, L.; Su, H.; Guibas, L.J. PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017. [Google Scholar]
- Wang, L.; Xu, Y.; Li, Y. Aerial Lidar Point Cloud Voxelization with Its 3D Ground Filtering Application. Photogramm. Eng. Remote Sens. 2017, 83, 95–107. [Google Scholar] [CrossRef]
- You, S.-H.; Jang, E.J.; Kim, M.-S.; Lee, M.-T.; Kang, Y.-J.; Lee, J.-E.; Eom, J.-H.; Jung, S.-Y. Change Point Analysis for Detecting Vaccine Safety Signals. Vaccines 2021, 9, 206. [Google Scholar] [CrossRef]
- Kovanič, Ľ.; Peťovský, P.; Topitzer, B.; Blišťan, P. Spatial Analysis of Point Clouds Obtained by SfM Photogrammetry and the TLS Method—Study in Quarry Environment. Land 2024, 13, 614. [Google Scholar] [CrossRef]
- Braun, J.; Braunová, H.; Suk, T.; Michal, O.; Peťovský, P.; Kuric, I. Structural and Geometrical Vegetation Filtering—Case Study on Mining Area Point Cloud Acquired by UAV Lidar. Acta Montan. Slovaca 2022, 26, 661–674. [Google Scholar] [CrossRef]
- Kovanič, Ľ.; Štroner, M.; Urban, R.; Blišťan, P. Methodology and Results of Staged UAS Photogrammetric Rockslide Monitoring in the Alpine Terrain in High Tatras, Slovakia, after the Hydrological Event in 2022. Land 2023, 12, 977. [Google Scholar] [CrossRef]
Area | Dimensions [m] | Number of Points | Mean Resolution [m] |
---|---|---|---|
Data 2 Training | 74 × 65 × 38 | 11,454,057 | 0.04 |
Data 2 Boulders | 50 × 42 × 22 | 3,726,774 | 0.05 |
Data 2 Tower | 85 × 72 × 26 | 20,941,671 | 0.03 |
Data 2 Rugged | 100 × 53 × 27 | 7,569,811 | 0.05 |
Characteristics | Abbreviation | Calculation |
---|---|---|
True positive rate | TPR | TPR = TP/(TP + FN) |
True negative rate | TNR | TNR = TN/(TN + FP) |
Balanced accuracy | BA | BA = (TPR + TNR)/2 |
F-score | FS | FS = 2TP/(2TP + FP + FN) |
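As an illustration, the four characteristics defined in the table above can be computed from a reference ground mask and a predicted ground mask as follows; the function and variable names are illustrative only.

# Computes TPR, TNR, BA, and F-score exactly as defined in the table above.
import numpy as np

def classification_metrics(reference, predicted):
    """reference, predicted: boolean arrays, True = point classified as ground."""
    tp = np.sum(reference & predicted)         # ground points correctly kept
    tn = np.sum(~reference & ~predicted)       # non-ground points correctly removed
    fp = np.sum(~reference & predicted)        # non-ground points wrongly kept
    fn = np.sum(reference & ~predicted)        # ground points wrongly removed
    tpr = tp / (tp + fn)
    tnr = tn / (tn + fp)
    ba = (tpr + tnr) / 2
    fs = 2 * tp / (2 * tp + fp + fn)
    return tpr, tnr, ba, fs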
Data 1 (Rocks):
Method | Cloth Resolution/Voxel Size [m] | Threshold [m] | TPR [%] | TNR [%] | BA [%] | FS [%] |
---|---|---|---|---|---|---|
CSF | 0.025 | 0.25 | 89.18 | 75.66 | 82.42 | 92.57 |
CSF | 0.050 | 0.25 | 87.88 | 77.04 | 82.46 | 91.93 |
CSF | 0.100 | 0.25 | 86.20 | 78.08 | 82.14 | 91.05 |
CSF | 0.025 | 0.20 | 87.66 | 78.16 | 82.91 | 91.89 |
CSF | 0.050 | 0.20 | 86.17 | 79.75 | 82.96 | 91.15 |
CSF | 0.100 | 0.20 | 84.07 | 80.97 | 82.52 | 90.01 |
CSF | 0.025 | 0.15 | 85.33 | 81.51 | 83.42 | 90.78 |
CSF | 0.050 | 0.15 | 83.53 | 83.35 | 83.44 | 89.86 |
CSF | 0.100 | 0.15 | 80.70 | 84.79 | 82.75 | 88.25 |
MSVC | 0.060 | - | 99.94 | 76.61 | 88.28 | 98.32 |
MSVC | 0.080 | - | 99.94 | 74.91 | 87.43 | 98.20 |
MSVC | 0.110 | - | 99.97 | 72.31 | 86.14 | 98.03 |
MSVC | 0.140 | - | 99.97 | 70.43 | 85.20 | 97.90 |
MSVC | 0.190 | - | 99.98 | 68.40 | 84.19 | 97.77 |
Method | Data Area | Cloth Resolution/Voxel Size [m] | Threshold [m] | TPR [%] | TNR [%] | BA [%] | FS [%] |
---|---|---|---|---|---|---|---|
CSF | Boulders | 0.025 | 0.25 | 99.35 | 99.57 | 99.46 | 99.32 |
MSVC | Boulders | 0.140 | - | 99.67 | 99.69 | 99.68 | 99.58 |
CSF | Tower | 0.050 | 0.15 | 97.93 | 98.57 | 98.25 | 97.56 |
MSVC | Tower | 0.110 | - | 99.78 | 98.00 | 98.89 | 97.95 |
CSF | Rugged | 0.050 | 0.25 | 98.51 | 98.16 | 98.33 | 97.88 |
MSVC | Rugged | 0.110 | - | 99.20 | 98.82 | 99.01 | 98.71 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).