Article

FaSS-MVS: Fast Multi-View Stereo with Surface-Aware Semi-Global Matching from UAV-Borne Monocular Imagery

Boitumelo Ruf, Martin Weinmann and Stefan Hinz
1 Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB, 76131 Karlsruhe, Germany
2 Institute of Photogrammetry and Remote Sensing, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany
* Author to whom correspondence should be addressed.
Sensors 2024, 24(19), 6397; https://doi.org/10.3390/s24196397
Submission received: 29 August 2024 / Revised: 25 September 2024 / Accepted: 29 September 2024 / Published: 2 October 2024
(This article belongs to the Special Issue Advances on UAV-Based Sensing and Imaging)

Abstract

With FaSS-MVS, we present a fast, surface-aware semi-global optimization approach for multi-view stereo that allows for rapid depth and normal map estimation from monocular aerial video data captured by UAVs. The data estimated by FaSS-MVS, in turn, facilitate online 3D mapping, meaning that a 3D map of the scene is immediately and incrementally generated while the image data are acquired or received. FaSS-MVS employs a hierarchical processing scheme in which depth and normal data, as well as corresponding confidence scores, are estimated in a coarse-to-fine manner, allowing efficient handling of the large scene depths inherent to oblique images acquired by UAVs flying at low altitudes. The actual depth estimation uses a plane-sweep algorithm for dense multi-image matching to produce depth hypotheses, from which the final depth map is extracted by means of a surface-aware semi-global optimization that reduces the fronto-parallel bias of Semi-Global Matching (SGM). Given the estimated depth map, the pixel-wise surface normals are then computed by reprojecting the depth map into a point cloud and evaluating the normal vectors within a confined local neighborhood. In a thorough quantitative and ablative study, we show that the accuracy of the 3D information computed by FaSS-MVS is close to that of state-of-the-art offline multi-view stereo approaches, with the error staying within one order of magnitude of that of COLMAP. At the same time, the average runtime of FaSS-MVS for estimating a single depth and normal map is less than 14% of that of COLMAP, allowing online, incremental processing of full HD images at 1–2 Hz.
Keywords: multi-view stereo; plane-sweep multi-image matching; semi-global optimization; surface-awareness; online processing; oblique aerial imagery; UAVs
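As a brief illustration of the normal estimation step described in the abstract, the following Python/NumPy sketch back-projects a depth map into a point cloud using the camera intrinsics and derives per-pixel normals from a local central-difference neighborhood. This is a minimal sketch of the general idea, not the authors' implementation: the intrinsic matrix K, the function names, and the neighborhood size are illustrative assumptions, and image-border handling is omitted for brevity.

```python
# Minimal sketch (not the FaSS-MVS implementation): back-project a depth map
# into a 3D point cloud and estimate per-pixel surface normals from a confined
# local neighborhood. K and all names are illustrative assumptions.
import numpy as np

def backproject_depth(depth: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Reproject an HxW depth map into an HxWx3 point cloud (camera frame)."""
    h, w = depth.shape
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack([x, y, depth])

def normals_from_depth(depth: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Per-pixel normals from the cross product of local tangent vectors."""
    pts = backproject_depth(depth, K)
    # Central differences over neighboring points (borders wrap; proper edge
    # handling is omitted in this sketch).
    dx = np.roll(pts, -1, axis=1) - np.roll(pts, 1, axis=1)
    dy = np.roll(pts, -1, axis=0) - np.roll(pts, 1, axis=0)
    n = np.cross(dx, dy)
    n /= np.linalg.norm(n, axis=2, keepdims=True) + 1e-12
    # Orient the normals towards the camera (camera looks along +z).
    flip = np.sum(n * pts, axis=2) > 0
    n[flip] *= -1.0
    return n

# Example usage with a synthetic fronto-parallel plane at 10 m depth:
K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
normals = normals_from_depth(np.full((1080, 1920), 10.0), K)
```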

Share and Cite

MDPI and ACS Style

Ruf, B.; Weinmann, M.; Hinz, S. FaSS-MVS: Fast Multi-View Stereo with Surface-Aware Semi-Global Matching from UAV-Borne Monocular Imagery. Sensors 2024, 24, 6397. https://doi.org/10.3390/s24196397

AMA Style

Ruf B, Weinmann M, Hinz S. FaSS-MVS: Fast Multi-View Stereo with Surface-Aware Semi-Global Matching from UAV-Borne Monocular Imagery. Sensors. 2024; 24(19):6397. https://doi.org/10.3390/s24196397

Chicago/Turabian Style

Ruf, Boitumelo, Martin Weinmann, and Stefan Hinz. 2024. "FaSS-MVS: Fast Multi-View Stereo with Surface-Aware Semi-Global Matching from UAV-Borne Monocular Imagery" Sensors 24, no. 19: 6397. https://doi.org/10.3390/s24196397

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
