Accuracy Assessment of Point Clouds from LiDAR and Dense Image Matching Acquired Using the UAV Platform for DTM Creation
Abstract
1. Introduction
2. Materials and Methods
2.1. Experiment Description
2.2. Platform
2.3. Test Area and Data Used in the Experiment
2.4. Methodology of Accuracy Assessment
- DTM from the ULS point cloud (DTMULS): for each cell of a 0.25 m and a 0.5 m grid, the height was calculated by averaging the points falling inside the cell; cells containing no points were filled using Natural Neighbor interpolation;
- DSM from the ULS point cloud (DSMULS): for each cell of a 0.25 m and a 0.5 m grid, the height was taken as the maximum of the points falling inside the cell; cells containing no points were filled using Natural Neighbor interpolation;
- DTM from the DIM point cloud (DTMDIM): for each cell of a 0.25 m and a 0.5 m grid, the height was calculated by averaging the points falling inside the cell; cells containing no points were filled using Natural Neighbor interpolation (a minimal gridding sketch is given after this list).
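The per-cell aggregation described above can be summarised in a short sketch. The snippet below is a minimal illustration, not the authors' processing chain: it averages (or takes the maximum of) point heights within each grid cell and then fills empty cells by interpolation. SciPy offers no Natural Neighbor interpolator, so linear interpolation stands in for it here; the function name `grid_dtm` and its parameters are hypothetical.

```python
# Minimal sketch (assumptions noted above): rasterize a 3-D point cloud into a
# regular grid by aggregating point heights per cell, then interpolate cells
# that received no points. Linear interpolation substitutes for the Natural
# Neighbor method used in the paper.
import numpy as np
from scipy.interpolate import griddata

def grid_dtm(points, cell=0.25, agg="mean"):
    """points: (N, 3) array of x, y, z; agg: 'mean' for a DTM, 'max' for a DSM."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    x0, y0 = x.min(), y.min()
    cols = int(np.ceil((x.max() - x0) / cell)) + 1
    rows = int(np.ceil((y.max() - y0) / cell)) + 1
    ci = ((x - x0) / cell).astype(int)          # column index of each point
    ri = ((y - y0) / cell).astype(int)          # row index of each point
    flat = ri * cols + ci                        # flattened cell index

    if agg == "mean":
        sums = np.bincount(flat, weights=z, minlength=rows * cols)
        counts = np.bincount(flat, minlength=rows * cols)
        grid = np.full(rows * cols, np.nan)
        filled = counts > 0
        grid[filled] = sums[filled] / counts[filled]
    else:                                        # 'max' for a DSM-style surface
        grid = np.full(rows * cols, -np.inf)
        np.maximum.at(grid, flat, z)
        grid[np.isinf(grid)] = np.nan
    grid = grid.reshape(rows, cols)

    # Fill cells without points by interpolating from the filled cells.
    rr, cc = np.mgrid[0:rows, 0:cols]
    known = ~np.isnan(grid)
    if (~known).any():
        grid[~known] = griddata(
            (rr[known], cc[known]), grid[known],
            (rr[~known], cc[~known]), method="linear")
    return grid
```

Calling, for example, `grid_dtm(ground_points, cell=0.5)` would produce the analogue of the 0.5 m DTM grids above, assuming `ground_points` holds only ground-classified returns.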
3. Results
3.1. Accuracy Assessment of DTMs, Based on Surveying Field Measurements
3.2. The Comparison of the DTMs with ALS Data
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Name | Hawk Moth |
---|---|
Developer | MSP |
Empty weight | 8.5 kg |
Max. payload | 4 kg |
Average cruise speed | 5 m/s |
Max. cruise speed | 12.5 m/s |
Hovering time | 15 min (with 3 kg payload) |
Sensor equipment | 1. YellowScan Surveyor ultralight laser scanner (weight: 1.60 kg; size: 100 × 150 × 140 mm); 2. Sony Alpha a6000 RGB camera (weight: 0.47 kg; 24.3 MPix APS-C CMOS sensor) |
Flight Mission | Number of Photos | Number of Control Points | Control: RMSX (m) | Control: RMSY (m) | Control: RMSZ (m) | Number of Check Points | Check: RMSX (m) | Check: RMSY (m) | Check: RMSZ (m) |
---|---|---|---|---|---|---|---|---|---|
1 | 172 | 11 | 0.076 | 0.121 | 0.106 | 4 | 0.084 | 0.110 | 0.122 |
2 | 173 | 10 | 0.055 | 0.031 | 0.077 | 5 | 0.075 | 0.028 | 0.082 |
3 | 153 | 10 | 0.078 | 0.042 | 0.016 | 5 | 0.067 | 0.060 | 0.067 |
4 | 149 | 8 | 0.030 | 0.032 | 0.027 | 5 | 0.028 | 0.037 | 0.072 |
5 | 168 | 8 | 0.034 | 0.031 | 0.054 | 4 | 0.052 | 0.048 | 0.083 |
6 | 176 | 11 | 0.049 | 0.058 | 0.072 | 4 | 0.025 | 0.055 | 0.076 |
7 | 157 | 8 | 0.118 | 0.042 | 0.026 | 4 | 0.088 | 0.054 | 0.064 |
8 | 137 | 7 | 0.047 | 0.031 | 0.093 | 4 | 0.027 | 0.082 | 0.101 |
Vegetation Height Level | Number of Check Points | DTMDIM 0.25 m: Mean (m) | DTMDIM 0.25 m: RMSEZ (m) | DTMDIM 0.5 m: Mean (m) | DTMDIM 0.5 m: RMSEZ (m) | DTMULS 0.25 m: Mean (m) | DTMULS 0.25 m: RMSEZ (m) | DTMULS 0.5 m: Mean (m) | DTMULS 0.5 m: RMSEZ (m) | DTMALS 0.5 m: Mean (m) | DTMALS 0.5 m: RMSEZ (m) |
---|---|---|---|---|---|---|---|---|---|---|---|
I. 0–20 cm | 60 | 0.062 | 0.139 | 0.064 | 0.144 | 0.005 | 0.109 | 0.005 | 0.121 | 0.091 | 0.115 |
II. 20–40 cm | 91 | 0.181 | 0.242 | 0.177 | 0.242 | 0.037 | 0.139 | 0.035 | 0.145 | | |
III. 40–60 cm | 31 | 0.205 | 0.271 | 0.248 | 0.326 | 0.008 | 0.109 | 0.009 | 0.135 | | |
IV. >60 cm | 11 | 0.225 | 0.361 | 0.402 | 0.442 | 0.013 | 0.110 | 0.023 | 0.110 | | |
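The error statistics reported above follow the usual definitions, stated here for completeness (a standard formulation, with the sign convention assumed as DTM height minus surveyed height): for n check points, with Δh_i the height read from the evaluated DTM at check point i minus the surveyed height of that point,

$$\text{Mean} = \frac{1}{n}\sum_{i=1}^{n}\Delta h_i, \qquad \text{RMSE}_Z = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\Delta h_i^{2}}.$$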
Vegetation Height Level | Area (ha) | DTMDIM 0.25 m: Mean (m) | DTMDIM 0.25 m: STD σ (m) | DTMDIM 0.5 m: Mean (m) | DTMDIM 0.5 m: STD σ (m) | DTMULS 0.25 m: Mean (m) | DTMULS 0.25 m: STD σ (m) | DTMULS 0.5 m: Mean (m) | DTMULS 0.5 m: STD σ (m) |
---|---|---|---|---|---|---|---|---|---|
I. 0–20 cm | 0.933 | −0.056 | 0.305 | −0.005 | 0.279 | −0.018 | 0.145 | −0.065 | 0.220 |
II. 20–40 cm | 13.900 | 0.039 | 0.230 | 0.088 | 0.338 | −0.065 | 0.121 | −0.047 | 0.299 |
III. 40–60 cm | 18.616 | 0.138 | 0.242 | 0.190 | 0.274 | −0.071 | 0.160 | −0.088 | 0.163 |
IV. >60 cm | 16.067 | 0.274 | 0.407 | 0.292 | 0.378 | −0.041 | 0.291 | −0.104 | 0.277 |
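The raster-to-raster comparison behind this table (mean and standard deviation of the differences against the ALS reference, per vegetation-height class) can be reproduced with a simple difference-grid computation. The sketch below is a hypothetical illustration, assuming both DTMs are already resampled to a common grid and loaded as NumPy arrays; the function `difference_statistics` and the class coding in `class_mask` are not from the paper.

```python
# Minimal sketch (assumed workflow, not the authors' toolchain): statistics of
# the per-cell difference between an evaluated DTM and the reference ALS DTM,
# split by vegetation-height class (labels 1-4 for levels I-IV, hypothetical).
import numpy as np

def difference_statistics(dtm_eval, dtm_ref, class_mask, classes=(1, 2, 3, 4)):
    """All inputs are 2-D arrays on the same grid, in metres."""
    diff = dtm_eval - dtm_ref                      # per-cell height difference
    stats = {}
    for c in classes:
        values = diff[(class_mask == c) & ~np.isnan(diff)]
        if values.size:
            stats[c] = {"mean": float(values.mean()), "std": float(values.std())}
    return stats
```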
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Citation: Salach, A.; Bakuła, K.; Pilarska, M.; Ostrowski, W.; Górski, K.; Kurczyński, Z. Accuracy Assessment of Point Clouds from LiDAR and Dense Image Matching Acquired Using the UAV Platform for DTM Creation. ISPRS Int. J. Geo-Inf. 2018, 7, 342. https://doi.org/10.3390/ijgi7090342