Article

Development and Validation of Unmanned Aerial Vehicle Photogrammetry Simulator for Shaded Area Detection

by Jisung Kim, Jaekoo Kim, Kyeongmi Jeon, Joonmin Lee and Jaejoon Lee

1 Department of Civil and Environmental Engineering, Sungkyunkwan University, Suwon 16419, Korea
2 Institute of Geographic & Environmental Technology, GEOMEXSOFT Co., Ltd., Chuncheon 24461, Korea
3 Interdisciplinary Program in Crisis, Disaster and Risk Management, Sungkyunkwan University, Suwon 16419, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4454; https://doi.org/10.3390/app12094454
Submission received: 23 February 2022 / Revised: 16 April 2022 / Accepted: 26 April 2022 / Published: 28 April 2022

Abstract

In unmanned aerial vehicle (UAV) photogrammetry, the quality of three-dimensional (3D) models, including the ground sample distance (GSD) and shaded areas, is strongly affected by flight planning. However, the quality of the output cannot be estimated during flight planning; it depends on the experience of the operator. A simulator that can automatically generate a route, acquire images through simulation, and analyze the shaded areas without a real flight is therefore needed to reduce the time and cost of the repetitive work required to reach satisfactory quality. Although some simulators have been developed, they have notable limitations, and evaluating their performance is difficult owing to the lack of a validation method. To overcome these limitations, in this study target functions for flight planning and shaded area detection were defined, developed, and validated. The resulting simulator successfully planned flights and detected shaded areas, and its performance was validated to determine its applicability. The outputs of this study can be applied not only to UAV photogrammetry simulators but also to other 3D modeling simulators.

1. Introduction

1.1. Background

The introduction of virtual reality and augmented reality technologies has increased the demand for three-dimensional (3D) models [1,2,3,4,5,6,7,8]. Among the various 3D modeling techniques, unmanned aerial vehicle (UAV) photogrammetry has the advantages of time and cost efficiency [1,4,5,6,7,8,9,10,11,12,13].
The 3D modeling procedure using UAVs can be divided into three main parts: flight planning, flight and image acquisition, and image processing. Several studies on each part have been conducted to obtain low-cost, high-precision 3D models [1,4,14,15]. In the flight planning stage, studies on optimal path planning to obtain a high-quality 3D model have been conducted [16,17,18,19,20,21,22]. For image acquisition, optimal imaging techniques have been developed [4,11,23,24]. Concerning image processing, studies have been conducted to improve image registration accuracy [11,25,26,27].
In the process of 3D modeling using UAVs, the efficiency of both image acquisition and image processing closely depends on the flight planning stage, including path planning [11,18,22]. The optimal flight path of a UAV varies depending on the object’s structural characteristics and the surrounding environment [1,19,20,21,22]. As a result, the optimal flight path cannot be applied to all target objects, and path planning remains dependent on the operators’ skill level. While operators cannot check the quality of the 3D model during the flight path-planning process, they can create a 3D model through image processing after acquiring an image along the planned flight path. If the quality of the 3D model does not satisfy the requirements, the operator should repeat the same process of flight planning, aerial image acquisition, and image processing until the model quality is satisfactory. This entails the repeated expenditure of time and money. A simulator that allows the operator to estimate the quality of the model in advance during the planning stage is required to reduce these unnecessary expenses.
While various simulators have been developed for UAV aerial photogrammetry, they still have limitations. Some, for example, support only nadir photogrammetry above the object, not oblique photogrammetry alongside it [28,29]. Others can plan the path automatically but cannot check the quality of the outcome without a real flight [16,29,30,31,32]. Most simulators fall into this category, including Pix4D, Map Pilot, Atlas Flight, Tap2Map, UgCS, Mission Planner, APM Planner, and MAVProxy; they are incapable of detecting shaded areas or need a real flight to do so. Still others, such as DJI GS Pro and Drone Harmony, are limited in their ability to secure exact spatial objects for simulation [4,33,34]. Therefore, it is difficult to find a simulator that can automatically design the path and check the quality of the 3D model from the planning step without an actual flight [35].
To overcome these limitations, Kim et al. developed a simulator that can plan the flight path and check the shaded areas of a 3D model in advance [36,37]. The simulator can check the quality of the 3D model before flying and supports both nadir and oblique shooting, and its efficacy has been reported in a previous study [37]. That study, however, focused on the simulator's operational algorithm and the implementation of its functions; the validation was limited to implementation and did not examine the algorithm, the mathematical model that serves as the simulation's foundation, or its correct operation. Therefore, the performance of the simulator was not evaluated.
As a result, despite the increasing demand for 3D modeling and need for UAV photogrammetry for efficient 3D modeling, the lack of development and verification of an appropriate simulator greatly hinders the utilization and advancement of UAV photogrammetry. Therefore, in this study, we developed and validated a UAV photogrammetry simulator that can calculate modeling resources, detect shaded areas, and estimate the quality of the output.

1.2. Objectives and Scope

The purpose of this study is to develop and validate a UAV photogrammetry simulator for 3D modeling. The specific objectives of this study are the development of a UAV simulator for 3D modeling and validation of the developed simulator. The scope of this study does not include verifying the simulator’s efficiency.

2. Materials and Methods

To develop and validate the simulator, it is necessary to clearly define the functions to be established, validation items, and the process that can represent the performance. Furthermore, the performance indicators for validation should be valid and objective. Figure 1 shows the procedure used in this study.

2.1. Development of UAV Photogrammetry Simulator for 3D Modeling

2.1.1. Target Function

To develop the simulator, target functions were set as the requirements, and the user interface and user experience were designed. The main purpose of the simulator is to develop an optimal flight plan. To achieve this objective, the simulator should correctly choose waypoints to meet quality indicators, such as the ground sample distance (GSD) and overlap, and should allow the operator to check the output quality. Table 1 lists the essential functions of the simulator and the details of each function.

2.1.2. Architecture

The basic structure of the simulator follows the one reported by Kim et al. [36,37] because it is the most recently developed simulator and the only one designed specifically for shaded area detection. The system architecture was designed as shown in Figure 2. It consists of a component that uses a portal map service, a database that stores other spatial information, actual UAV hardware or software in the loop (SITL), and the simulator itself.
The portal map service module refers to the map service provided by portals, such as V-world, Naver, and Daum, in the form of a tile map; the corresponding map service is called through the Portal Map Connector. The database stores and manages spatial information that is not provided by the portal services, including two-dimensional (2D) geographic information system (GIS) data, raster data, 3D model data, and metadata. The simulator can be applied to an actual or virtual UAV. When applied to an actual UAV, the simulator transmits route information to the UAV using the Micro Air Vehicle Link (MAVLink) protocol; the UAV moves along the route and sends back a log recording the actual photo-taking progress. When a virtual UAV is used through SITL, the route and the recording log are likewise exchanged through the MAVLink protocol, but because the UAV is virtual rather than real, a MAVLink converter is used instead of the MAVLink interface.
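As an illustration of this exchange, the sketch below uploads planner waypoints to a real or SITL autopilot with the open-source pymavlink bindings, following the standard MAVLink mission protocol. The connection string, waypoint list, and helper name upload_mission are illustrative assumptions rather than the simulator's actual code.

```python
# Minimal sketch of uploading waypoints over MAVLink with pymavlink.
# The connection string and waypoint list are illustrative assumptions,
# not the simulator's actual implementation.
from pymavlink import mavutil

def upload_mission(conn_str, waypoints):
    """waypoints: list of (lat, lon, alt) tuples produced by the path planner."""
    master = mavutil.mavlink_connection(conn_str)   # e.g., a SITL instance on UDP
    master.wait_heartbeat()

    # Announce how many mission items will follow.
    master.mav.mission_count_send(master.target_system,
                                  master.target_component,
                                  len(waypoints))

    # The autopilot pulls items one by one with MISSION_REQUEST messages.
    for _ in waypoints:
        msg = master.recv_match(type=['MISSION_REQUEST', 'MISSION_REQUEST_INT'],
                                blocking=True)
        lat, lon, alt = waypoints[msg.seq]
        master.mav.mission_item_send(
            master.target_system, master.target_component,
            msg.seq,
            mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
            mavutil.mavlink.MAV_CMD_NAV_WAYPOINT,
            0, 1,               # current, autocontinue
            0, 0, 0, 0,         # hold time, acceptance radius, pass radius, yaw
            lat, lon, alt)

    # The autopilot acknowledges the complete mission.
    master.recv_match(type='MISSION_ACK', blocking=True)

# upload_mission('udpin:127.0.0.1:14550', [(37.29, 126.97, 50.0)])
```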
The simulator uses the portal map connector to import spatial information from the portal and the XDO converter to convert V-world XDO files. The GIS data imported from the portal and those from the database are displayed on the screen in 2D and 3D form using the 2D map viewer and 3D map viewer. Before proceeding with flight simulation, the flight area of interest is set through GIS data management, and the UAV profile is applied through the UAV profile manager. The actual UAV flight is performed by the waypoint manager, and the UAV is simulated by the SITL connector and the Ground Control Station Simulator (GCS-S) executer. The simulator can detect the shaded area through GCS-S analysis.

2.1.3. Development Environment

A development environment was established for functional development. Because the simulator developed in this study targets the Windows operating system, the development environment was established on Windows 10 Pro (Microsoft, USA).
Open Scene Graph was used to visualize 3D spatial objects, and Potree was used for point clouds. Open Scene Graph is an open-source, high-performance 3D toolkit used by application developers in fields such as visual simulation, games, virtual reality, scientific visualization, and 3D modeling. It is written in standard C++ and OpenGL and runs on Windows as well as other platforms. The simulator developed in this study was created with C++, Delphi, and OpenGL.

2.1.4. Automatic Path Planning for Flight

The simulator automatically creates a path that satisfies output quality targets, such as the GSD, forward overlap, side overlap, and camera tilt, given the focal length, sensor size, and image size. Figure 3 illustrates the process of automatic path planning and the parameters used in nadir and oblique photography. The path planning process comprises creating a convex polygon, determining the distance from the object, determining the course, and determining the shooting waypoints. The path design requires the following inputs: object model, GSD, focal length, sensor size, image size, forward overlap, side overlap, and camera tilt.
In the convex polygon creation step, the building polygon is made convex by filling in the concave parts of the 3D model. Concave parts are difficult for a UAV to access and pose a risk of accidents; the polygon is therefore converted to a convex one to prevent accidents in real flight, as sketched below.
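A minimal sketch of this step, assuming the building footprint is available as 2D vertices and using SciPy's convex hull (the paper does not specify how the convex polygon is computed):

```python
# Sketch of the "convex polygon" step: the building footprint is replaced
# by its 2D convex hull so the UAV never has to enter concave recesses.
# The footprint coordinates below are illustrative.
import numpy as np
from scipy.spatial import ConvexHull

footprint = np.array([          # (x, y) vertices of an L-shaped footprint, in meters
    [0, 0], [30, 0], [30, 10], [12, 10], [12, 25], [0, 25],
])

hull = ConvexHull(footprint)
convex_polygon = footprint[hull.vertices]   # hull vertices in counter-clockwise order
print(convex_polygon)
# The concave notch at (12, 10) disappears; the oblique path is then buffered
# outward from this convex outline instead of the raw footprint.
```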
Figure 4a illustrates the flight path in nadir photography; the nadir path follows a zigzag shape on a plane at a certain distance $D$ above the top of the object. Figure 4b,c demonstrate the oblique path, which keeps a buffer distance $D$ from the convex polygon of the object while moving floor by floor. For both nadir and oblique photography, $D_m$ is the distance between the photo-taking points within the same strip, and $D_s$ is the distance between the strips.
In the step for determining the distance from the object, the distance between the object and the UAV is calculated using the focal length, sensor size, and image size. The distance $D$ is given by Equation (1), where $s$ is the length of the sensor, $G$ is the GSD, $f$ is the focal length, and $p$ is the number of pixels.
$$D = \frac{G \cdot f \cdot p}{s} \quad (1)$$
In the course setting stage, the interval between two consecutive strips is determined. The strip direction in nadir photography is the same as the UAV's yaw direction. Therefore, the x-axis components of the camera, namely the width of the sensor ($s_w$) and the focal length in the x direction ($f_x$), together with the distance ($D$) from the object and the side overlap ($a_s$), are used. The distance ($D_s$) between strips in nadir photography is given by Equation (2).
$$D_s = \frac{D \cdot s_w}{f_x} (1 - a_s) \quad (2)$$
In oblique photography, the strip direction is perpendicular to the UAV's yaw direction. Thus, the y-axis components of the camera, namely the height of the sensor ($s_h$) and the focal length in the y direction ($f_y$), together with the distance ($D$) from the object and the side overlap ($a_s$), are used. The distance ($D_s$) between strips in oblique photography is given by Equation (3).
$$D_s = \frac{D \cdot s_h}{f_y} (1 - a_s) \quad (3)$$
If visual line of sight (VLOS) flight is required, a minimum height ($H_s$) above the land surface can be set to maintain the line of sight. In this case, $H_s$ can be higher than the object height.
In the waypoints stage, the waypoints are determined within the strip according to the set forward overlap, which is perpendicular to the side overlap. The UAV takes images at the set interval ($D_m$) along the strip. In nadir photography, the height of the sensor ($s_h$), the focal length in the y direction ($f_y$), the distance ($D$) from the object, and the forward overlap ($a_f$) are used to determine the interval ($D_m$) between waypoints along the strip, as given by Equation (4).
$$D_m = \frac{D \cdot s_h}{f_y} (1 - a_f) \quad (4)$$
Similarly, in oblique photography, the width of the sensor ($s_w$), the focal length in the x direction ($f_x$), the distance ($D$) from the object, and the forward overlap ($a_f$) are used to determine the interval ($D_m$) between waypoints along the strip, as given by Equation (5).
$$D_m = \frac{D \cdot s_w}{f_x} (1 - a_f) \quad (5)$$
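As a worked illustration, the sketch below evaluates Equations (1)-(5) in Python for the first test scenario. The symbols mirror the paper's notation; the 1″ sensor dimensions (13.2 × 7.4 mm) are an assumption, since Table 4 lists only the pixel count and focal length, and a single focal length f stands in for both f_x and f_y.

```python
# Sketch of Equations (1)-(5) with consistent units: focal length and sensor
# size in mm, GSD in m/pixel, so distances come out in meters. Camera values
# follow Table 4 and the first scenario (Table 5); the sensor dimensions in
# mm are assumed, not given in the paper.

def object_distance(gsd, f, pixels, sensor_len):
    """Equation (1): distance D that yields the target GSD."""
    return gsd * f * pixels / sensor_len

def strip_spacing(d, sensor_len, f, side_overlap):
    """Equations (2)-(3): spacing D_s between adjacent strips."""
    return d * sensor_len / f * (1.0 - side_overlap)

def waypoint_spacing(d, sensor_len, f, forward_overlap):
    """Equations (4)-(5): spacing D_m between waypoints along a strip."""
    return d * sensor_len / f * (1.0 - forward_overlap)

f = 8.8                  # focal length, mm (Table 4)
s_w, s_h = 13.2, 7.4     # assumed 1" sensor dimensions, mm (not in the paper)
p_w, p_h = 5472, 3078    # image size in pixels (Table 4)
gsd = 0.002              # target GSD, 2 mm/pixel (Table 5)

D = object_distance(gsd, f, p_w, s_w)          # ~7.3 m from the object
Ds_nadir = strip_spacing(D, s_w, f, 0.60)      # Eq. (2): uses s_w, f_x
Ds_oblique = strip_spacing(D, s_h, f, 0.60)    # Eq. (3): uses s_h, f_y
Dm_nadir = waypoint_spacing(D, s_h, f, 0.80)   # Eq. (4)
Dm_oblique = waypoint_spacing(D, s_w, f, 0.80) # Eq. (5)
print(D, Ds_nadir, Ds_oblique, Dm_nadir, Dm_oblique)
```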

2.2. Validation Methodology

2.2.1. Validation Procedure

Before the validation, validation items that can demonstrate the performance of the simulator and be proven objectively were set. While all target functions operate correctly as designed, some functions related to the numerical outputs should be validated in terms of accuracy. To validate the functions, indicators are set by each function, as listed in Table 2. The functions to be assessed are path planning, flight simulation, and shaded area detection because these are closely related to the main purpose of the simulator and are crucial in flight planning.
Figure 5 shows the validation process, which is conducted by comparing real and virtual flight data and comparing the estimated and real outputs. Because flight simulation and shading area detection can be performed after path planning, validation of the flight simulation and shading area detection were conducted after validation of the path planning. The performance validation of each function can be quantified using the corresponding performance indicators. Each indicator is designed to be easy for users to understand.

2.2.2. Validation of Path Planning

As the path is planned after setting the GSD and overlap as the targets, the image obtained during real flight using the designed waypoints must satisfy them. To verify this, the GSD and overlap are investigated.
The GSD of the image captured in real flight was compared with that set as the target. Because of the characteristics of the object, depth exaggeration may occur, causing the GSD to change. To remove this exaggeration, an area of interest (AOI) was defined in the object and the GSD for the area was calculated. GSD was computed in both the vertical and horizontal directions because the GSD of the horizontal axis and that of the vertical axis are different.
Figure 6 illustrates a line segment $l_k$ of true length $L_k$ captured in the image. If $l_k$ spans $p_{xk}$ pixels in the horizontal direction and $p_{yk}$ pixels in the vertical direction, as shown in Figure 6, and the GSDs in the horizontal and vertical directions are $G_x$ and $G_y$, respectively, then the relationship between them is
$$G_x^2 \cdot p_{xk}^2 + G_y^2 \cdot p_{yk}^2 = L_k^2 \quad (6)$$
If the width and height of the sensor are $s_w$ and $s_h$, respectively, and the pixel width and height of the image are $p_w$ and $p_h$, respectively, then the relationship between $G_x$ and $G_y$ is given by Equation (7). From Equations (6) and (7), the GSD can be calculated in each direction.
$$G_y = \frac{s_h}{p_h} \cdot \frac{p_w}{s_w} \cdot G_x \quad (7)$$
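Combining Equations (6) and (7) gives a closed form: substituting $G_y = k G_x$ with $k = (s_h/p_h)(p_w/s_w)$ into Equation (6) yields $G_x = L_k / \sqrt{p_{xk}^2 + k^2 p_{yk}^2}$. A sketch of this computation, with hypothetical measurement values:

```python
# Sketch of solving Equations (6)-(7) for the per-image GSD from one
# reference segment of known length L_k. The segment pixel lengths and
# sensor values are illustrative.
import math

def gsd_from_segment(L_k, px_k, py_k, s_w, s_h, p_w, p_h):
    """Return (G_x, G_y) from a segment of true length L_k (m) that spans
    px_k horizontal and py_k vertical pixels in the image."""
    k = (s_h / p_h) * (p_w / s_w)                     # Equation (7): G_y = k * G_x
    g_x = L_k / math.sqrt(px_k**2 + (k * py_k)**2)    # from Equation (6)
    return g_x, k * g_x

# A 1.0 m target edge spanning 420 x 180 pixels (hypothetical measurement):
print(gsd_from_segment(1.0, 420, 180, s_w=13.2, s_h=7.4, p_w=5472, p_h=3078))
```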
The overlap in the captured image during real flight is compared with the overlap set as the target. As in the GSD examination, owing to the characteristics of the object, depth exaggeration may occur and change the overlap. To remove the exaggeration, an AOI was placed in the object and the overlap for the area was analyzed. In this study, the term “overlap” refers to both forward and side overlap. As a result, the examination was conducted for two types of overlap.

2.2.3. Validation of Flight Simulation

In the validation of the simulation, the coincidence between one image captured by the simulator and another captured by the actual flight is evaluated. This indicates the accuracy of the acquired image in advance without actual flight in the field. Figure 7 shows the measures used to estimate the coincidence.
In Figure 7, $I_s$ is an image obtained by the simulator, while $I_r$ is an image obtained by real flight. Both $I_s$ and $I_r$ were captured at the same coordinates in the simulator and during the actual flight. The sizes of the common area and of $I_r$ in square pixels are $A_c$ and $A_r$, respectively. The image coincidence rate $R_c$ can be calculated using $A_c$ and $A_r$ after image matching and transformation using the feature points in the common AOI.
$$R_c = \frac{A_c}{A_r} \quad (8)$$
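One possible implementation of this measurement is sketched below. The paper does not name the matching method, so ORB features with RANSAC homography estimation in OpenCV and a Shapely polygon intersection are assumed: the simulator image outline is warped into the real image frame, and $R_c$ is the intersected area divided by the real image area.

```python
# Sketch of the coincidence rate R_c = A_c / A_r (Equation (8)), assuming
# OpenCV feature matching and a homography; the actual matching method used
# in the paper is not specified.
import cv2
import numpy as np
from shapely.geometry import Polygon

def coincidence_rate(sim_img, real_img):
    orb = cv2.ORB_create(5000)
    k1, d1 = orb.detectAndCompute(sim_img, None)
    k2, d2 = orb.detectAndCompute(real_img, None)

    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Project the simulator image outline into the real image frame.
    h, w = sim_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    warped = cv2.perspectiveTransform(corners, H).reshape(-1, 2)

    hr, wr = real_img.shape[:2]
    real_rect = Polygon([(0, 0), (wr, 0), (wr, hr), (0, hr)])
    common = Polygon(warped).intersection(real_rect)
    return common.area / real_rect.area    # A_c / A_r
```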

2.2.4. Validation of Shaded Area Detection

The shaded area detection was validated by comparing the shaded area calculated from the actual flight with that estimated by the simulator. The validation results were obtained by comparing the sizes of the shaded area in the actual and virtual images. The total surface area of the object ($s_{total}$), the common shaded area ($s_c$), the shaded area determined by the simulator ($s_s$), and the shaded area determined by the actual flight ($s_r$) were used to calculate type-I and type-II errors. The relationships between these indicators are depicted in Figure 8. Based on these areas, the shaded area detection can be validated statistically according to the equations provided in Table 3.
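For illustration, the sketch below derives the Table 3 cells and the resulting statistics from the four areas; the function and key names are arbitrary. Feeding in the scenario-1 areas from Table 9 reproduces the scenario-1 column of Table 10.

```python
# Sketch of the Table 3 statistics computed from the four surface areas.

def shaded_area_metrics(s_total, s_c, s_s, s_r):
    tp = s_c                              # shaded in both
    fp = s_s - s_c                        # shaded only in the simulator
    fn = s_r - s_c                        # shaded only in the actual flight
    tn = s_total - s_s - s_r + s_c        # unshaded in both
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "accuracy": (tp + tn) / s_total,
        "precision": precision,
        "recall": recall,
        "specificity": tn / (tn + fp),
        "f1": 2 * precision * recall / (precision + recall),
    }

# Scenario 1 (Table 9): s_total=2328.13, s_c=51.36, s_s=90.96, s_r=137.04
print(shaded_area_metrics(2328.13, 51.36, 90.96, 137.04))
# -> accuracy ~0.946, precision ~0.565, recall ~0.375 (cf. Table 10)
```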

2.3. Materials

The simulator, testbed, scenario, and aerial imaging equipment are the main components of the preliminary works and materials for the test. The simulator is the subject of the validation, while the testbed, scenario, and aerial imaging equipment serve as the experimental environment.

2.3.1. Testbeds

To validate the performance of the simulator, the testbed should satisfy the following conditions:
  • It must be in an area where aerial photogrammetry is feasible and simple to perform
  • It must not be located in a prohibited or restricted zone
  • An appropriate shading area should be located within the testbed
  • It should be easy to obtain spatial information, such as drawings and 3D models, of the testbed
The Geomexsoft office building (Figure 9a) and the Research Institute for Gangwon building (Figure 9c) satisfy all the conditions; therefore, they were selected as testbeds. The area around Seo-myeon, Chuncheon-si, Gangwon-do, where the targets are located, is unrestricted for UAV photogrammetry. For validation, the testbed boundaries of the Geomexsoft office building (Figure 9b) and the Research Institute for Gangwon building (Figure 9d) were obtained. The Geomexsoft office building was used to validate path planning, flight simulation, and shaded area detection, while the Research Institute for Gangwon building was used to validate shaded area detection.

2.3.2. UAV, Digital Camera, and GCS

The simulator used in this study can export flight waypoints in the Mission Planner mission format or the DJI JavaScript Object Notation (JSON) waypoint format. As a result, a UAV and GCS that support these file types are required. To obtain nadir and oblique images, the camera must be tiltable. Therefore, the DJI Phantom 4 Pro was used in this study. The camera specifications, taken from the manufacturer's web page [38], play an important role in determining the flight path and are listed in Table 4. Aerial photogrammetry should follow the path that is automatically generated by the simulator. Therefore, the GCS is prepared only for emergencies and does not perform any controls that affect the experimental results during flight and image acquisition.

2.3.3. Scenarios

Two scenarios were set to validate the simulator. The first scenario was created for verification and was run by adjusting the AOI, GSD, and overlap. The AOI for simulation and performance validation is the Geomexsoft building, one of the testbeds (Figure 9a,b). During the flight, the AOI was easily discernible. The GSD was chosen with the surrounding environment in mind, and the overlap was optimized for aerial photogrammetry for 3D modeling [11,23,24]. Table 5 lists the specified GSD and overlaps.
The second scenario was created to validate shaded area detection by comparing real and virtual flights independently of the simulator's own path planning. In this scenario, the flight path was automatically created by Pix4D for the AOI, the Research Institute for Gangwon building (Figure 9c,d). The simulator imported the flight path from Pix4D and detected shaded areas according to the path and flight environment.

3. Results

The validation tests were conducted in the testbed on 14 April 2021 and 25 September 2021 around the Research Institute for Gangwon building and the Geomexsoft office building, respectively.
Figure 10 illustrates the automatically generated waypoints and the actual waypoints for both scenarios. Both nadir and oblique photography were used in the first scenario. In contrast, only oblique photography was used in the second scenario to emphasize the validation of shaded area detection.

3.1. GSD Accuracy

Figure 11 illustrates part of the images used to analyze the GSD. Table 6 displays the designed value, mean horizontal and vertical GSD, arithmetic mean of the GSD in both directions, errors for each value, and standard deviations for each value.

3.2. Overlap Accuracy

Figure 12 shows part of the image registration performed to obtain the overlap. Image registration was performed from multiple control points, and the calculated overlap in both directions is shown in Table 7.

3.3. Image Coincidence Rate

Figure 13 demonstrates a sample of the image registration and coincidence rate calculation. Table 8 shows the coincidence results for the images.

3.4. Shaded Area Detection Accuracy

Shaded area detection was analyzed after classification into four areas, as indicated in Figure 8 and Table 3. The first area ($s_{total} - s_s - s_r + s_c$) is the common unshaded area observed in both the simulator and the actual flight (Figure 14a). The second ($s_c$) is the common shaded area (Figure 14b). The third ($s_s - s_c$) is the area determined as shaded in the simulator but not in the actual flight (Figure 14c). The final area ($s_r - s_c$) is unshaded in the simulator but shaded in the actual flight (Figure 14d).
In the first scenario, the true-positive area ($s_c$) is 51.36 m², the true-negative area is 2151.49 m², the false-positive area is 39.60 m², and the false-negative area is 85.68 m² (Table 9). In the second scenario, the true-positive area ($s_c$) is 247.72 m², the true-negative area is 9091.19 m², the false-positive area is 137.25 m², and the false-negative area is 363.52 m² (Table 9). Table 10 shows the indices used to validate the shaded area detection in both scenarios.

4. Discussion

4.1. Evaluation of Path Planning

The path is 3D and its constituent elements can be divided into 2D planar elements and a one-dimensional (1D) elevation element. A 2D plane element can be evaluated through a set overlap, and the 1D height element can be evaluated through a set GSD.
The analyzed GSD is close to the target value of 2 mm. The horizontal and vertical GSDs may differ depending on the physical size of the sensor pixels and the number of pixels in each direction. The horizontal GSD had an error of only 0.15 mm, as shown in Table 6, with an error rate of 7.6%. Conversely, the vertical GSD error (0.55 mm) was about 3.6 times the horizontal error, with an error rate of 27.5%. The simulator sets the flight height based on the horizontal GSD; therefore, the error arose from the difference in the effective pixel size between the two directions.
There was a slight difference between the designed and measured overlaps: the error was small for the forward overlap (−1.10 percentage points) and larger for the side overlap (4.01 percentage points) (Table 7).

4.2. Evaluation of Flight Simulation

The purpose of the simulator is to check the captured images in advance without the cost of an actual flight. From this viewpoint, any difference between the actual and simulated images counts against the simulator's validity. Because the shooting plan calls for an 80% forward overlap and a 60% side overlap, the common area between corresponding images should exceed 48% (0.80 × 0.60 = 0.48) if the camera is aimed correctly. In this experiment, the mean image coincidence reached 86.78%, and it was at least 83.7% within the 95% confidence interval (Table 8). However, in some images there were angle differences between the simulated and real images, as shown in Figure 13, which should be improved.

4.3. Evaluation of Shaded Area Detection

Shaded area detection is an important function that allows the quality of 3D models to be estimated in advance without an actual flight. The validation results demonstrate that the accuracy was more than 0.94 in both scenarios, an extremely high value. The precision was no less than 0.565 and was higher than the recall (0.375 and 0.405) in both cases. This means that an area depicted as shaded in the simulator is highly likely to be an actual shaded area, whereas some actual shaded areas are not detected by the simulator. Therefore, an area that the simulator marks as shaded can be trusted, although the simulator may underestimate the total shaded area.

5. Conclusions

This study covers the development and the performance validation of a simulator that can detect shaded areas for 3D modeling using a UAV.
Specifically, a simulator was developed and validated for path planning, flight simulation, and shaded area detection. It was determined that path planning operates in accordance with the target GSD and overlap. Furthermore, it was confirmed that the images obtained through flight simulation were substantially consistent with the images obtained during the actual flight. Finally, it was demonstrated that the shaded area detection function can be used to estimate the quality of the actual product.
Nevertheless, this study suffered from several limitations. First, the validation procedure in this study is not independent of the UAV positioning. Second, to demonstrate the reliability of the simulator developed in this study, it should be applied to various UAVs, testbeds, and scenarios. Therefore, future research should also be conducted to overcome these limitations.
This study has a significant potential impact in its development and application of a novel methodology to develop and validate the simulator. Finding photogrammetry simulators for UAV 3D modeling with shaded area detection is currently difficult. However, because of the benefits of UAV aerial surveying and the demand for 3D modeling, simulation technology is expected to advance significantly. Therefore, it is necessary to develop and validate various simulators in the future. The findings herein are expected to have substantial influence in this regard.

Author Contributions

Conceptualization, J.K. (Jaekoo Kim), K.J. and J.L. (Jaejoon Lee); methodology, J.K. (Jisung Kim) and J.L. (Jaejoon Lee); software, J.K. (Jaekoo Kim); validation, J.K. (Jisung Kim), J.L. (Joonmin Lee); formal analysis, J.K. (Jisung Kim); investigation, J.K. (Jisung Kim); resources, J.L. (Joonmin Lee); data curation, J.K. (Jisung Kim); writing—original draft preparation, J.K. (Jisung Kim); writing—review and editing, J.L. (Jaejoon Lee); visualization, J.K. (Jisung Kim); supervision, K.J.; project administration, J.L. (Jaejoon Lee); funding acquisition, J.K. (Jaekoo Kim) and K.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research and APC were funded by the Ministry of Land, Infrastructure and Transport (MOLIT, Korea), grant number 2021-DRMS-B147287-04.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This research was supported by a grant (2021-DRMS-B147287-04) of “the Development of customized realistic 3D geospatial information update and utilization technology based on consumer demand” funded by the Ministry of Land, Infrastructure and Transport (MOLIT, Korea).

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Zhang, X.; Zhao, P.; Hu, Q.; Ai, M.; Hu, D.; Li, J. A UAV-based Panoramic Oblique Photogrammetry (POP) Approach using Spherical Projection. ISPRS J. Photogramm. Remote Sens. 2020, 159, 189–219.
  2. Li, B.; Qi, H.; Jiang, W.; Fan, M.; Bi, H.; Yang, S.; Xu, Y.; Yu, H. Large-scale 3D Modeling Based on UAV Technology. In Proceedings of the 2020 International Conference on Virtual Reality and Visualization (ICVRV), Recife, Brazil, 13–14 November 2020; pp. 113–116.
  3. Tadeja, S.K.; Rydlewicz, W.; Lu, Y.; Kristensson, P.O.; Bubas, T.; Rydlewicz, M. PhotoTwinVR: An Immersive System for Manipulation, Inspection and Dimension Measurements of the 3D Photogrammetric Models of Real-Life Structures in Virtual Reality. arXiv 2019, arXiv:1911.09958.
  4. Nakama, J.; Parada, R.; Matos-Carvalho, J.P.; Azevedo, F.; Pedro, D.; Campos, L. Autonomous Environment Generator for UAV-Based Simulation. Appl. Sci. 2021, 11, 2185.
  5. Dong, X.; Kim, W.; Lee, K. Drone-Based Three-Dimensional Photogrammetry and Concave Hull by Slices Algorithm for Apple Tree Volume Mapping. J. Biosyst. Eng. 2021, 46, 474–484.
  6. Liu, Y.; Zheng, X.; Ai, G.; Zhang, Y.; Zuo, Y. Generating a High-Precision True Digital Orthophoto Map Based on UAV Images. ISPRS Int. J. Geo-Inf. 2018, 7, 333.
  7. Park, K.; Ham, S.; Lee, I. Automatic Georeferencing of Close-Range Façade Images Acquired in a Narrow and Long Alleyway Using RTK Drone Images; XLIII-B2-2020, XXIV ISPRS Congress (2020 Edition); International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences: Nice, France, 2020; pp. 63–67.
  8. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV Photogrammetry for Mapping and 3D Modeling–Current Status and Future Perspectives. In Proceedings of the ISPRS Zurich 2011 Workshop, Zurich, Switzerland, 14–16 September 2011; XXXVIII-1/C22. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences: Hannover, Germany, 2011.
  9. Koch, T.; Korner, M.; Fraundorfer, F. Automatic and Semantically-Aware 3D UAV Flight Planning for Image-Based 3D Reconstruction. Remote Sens. 2019, 11, 1550.
  10. Anifantis, A.S.; Camposeo, S.; Vivaldi, G.A.; Santoro, F.; Pascuzzi, S. Comparison of UAV Photogrammetry and 3D Modeling Techniques with Other Currently Used Methods for Estimation of the Tree Row Volume of a Super-High-Density Olive Orchard. Agriculture 2019, 9, 233.
  11. Seo, S.I.; Park, B.W.; Lee, B.K.; Kim, J.I. Generation of Mosaic Image using Aerial Oblique Images. J. Korean Soc. Geospat. Inf. Syst. 2014, 22, 145–154.
  12. Hastaoglu, K.O.; Gul, Y.; Poyraz, F.; Kara, B.C. Monitoring 3D Areal Displacements by a New Methodology and Software using UAV Photogrammetry. Int. J. Appl. Earth Obs. Geoinf. 2019, 83, 101916.
  13. Nex, F.; Remondino, F. UAV for 3D Mapping Applications: A Review. Appl. Geomat. 2014, 6, 1–15.
  14. Barrile, V.; Bilotta, G.; Nunnari, A. 3D Modeling with Photogrammetry by UAVs and Model Quality Verification. In 4th International GeoAdvances Workshop; ISPRS: Karabuk, Turkey, 2017; pp. 129–134.
  15. Vacca, G.; Dessi, A.; Sacco, A. The Use of Nadir and Oblique UAV Images for Building Knowledge. ISPRS Int. J. Geo-Inf. 2017, 6, 393.
  16. Fu, Z.; Yu, J.; Xie, G.; Chen, Y.; Mao, Y. A Heuristic Evolutionary Algorithm of UAV Path Planning. Wirel. Commun. Mob. Comput. 2018, 2018, 2851964.
  17. Battulwar, R.; Winkelmaier, G.; Valencia, J.; Naghadehi, M.Z.; Peik, B.; Abbasi, B.; Parvin, B.; Sattarvand, J. A Practical Methodology for Generating High-Resolution 3D Models of Open-Pit Slopes Using UAVs: Flight Path Planning and Optimization. Remote Sens. 2020, 12, 2283.
  18. Glowacki, D.; Hajduk, J.; Rodzewicz, M. Methods of Flight-path Planning for UAV Photogrammetry Missions with Consideration of Aircraft Dynamic Properties. In Proceedings of the 5th CEAS Air & Space Conference "Challenges in European Aerospace", Delft, The Netherlands, 7–11 September 2015; No. 23.
  19. Trajkovski, K.K.; Grigillo, D.; Petrovic, D. Optimization of UAV Flight Missions in Steep Terrain. Remote Sens. 2020, 12, 1293.
  20. Hu, X.; Pang, B.; Dai, F.; Low, K.H. Risk Assessment Model for UAV Cost-Effective Path Planning in Urban Environments. IEEE Access 2020, 8, 150162–150173.
  21. He, Z.; Zhao, L. The Comparison of Four UAV Path Planning Algorithms Based on Geometry Search Algorithm. In Proceedings of the 2017 9th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 26–27 August 2017; pp. 33–36.
  22. Dai, R.; Fotedar, S.; Radmanesh, M.; Kumar, M. Quality-aware UAV Coverage and Path Planning in Geometrically Complex Environments. Ad Hoc Netw. 2018, 73, 95–105.
  23. Cho, J.; Lee, J.; Lee, B. A Study on the Optimal Shooting Conditions of UAV for 3D Production and Orthophoto Generation. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2020, 38, 645–653.
  24. Martin, R.A.; Blackburn, L.; Pulsipher, J.; Franke, K.; Hedengren, J.D. Potential Benefits of Combining Anomaly Detection with View Planning for UAV Infrastructure Modeling. Remote Sens. 2017, 9, 434.
  25. Martinez-Carricondo, P.; Aguera-Vega, F.; Carvajal-Ramirez, F.; Mesas-Carrascosa, F.; Garcia-Ferrer, A.; Perez-Porras, F. Assessment of UAV-photogrammetric Mapping Accuracy Based on Variation of Ground Control Points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10.
  26. Zhou, Q.; Liu, J. Automatic Orthorectification and Mosaicking of Oblique Images from a Zoom Lens Aerial Camera. Opt. Eng. 2015, 54, 013104.
  27. Shahbazi, M.; Sohn, G.; Theau, J.; Menard, P. Development and Evaluation of a UAV-Photogrammetry System for Precise 3D Environmental Modeling. Sensors 2015, 15, 27493–27524.
  28. Lu, P.; Geng, Q. Real-time Simulation System for UAV Based on Matlab/Simulink. In Proceedings of the 2011 IEEE 2nd International Conference on Computing, Control and Industrial Engineering, Wuhan, China, 20–21 August 2011; pp. 399–404.
  29. Jingsha, Z.; Qingbo, G.; Qing, F. UAV Flight Control System Modeling and Simulation Based on FlightGear. In Proceedings of the International Conference on Automatic Control and Artificial Intelligence (ACAI 2012), Xiamen, China, 3–5 March 2012; pp. 2231–2234.
  30. Ebeid, E.; Skriver, M.; Terkildsen, K.H.; Jensen, K.; Schultz, U.P. A Survey of Open-Source UAV Flight Controllers and Flight Simulators. Microprocess. Microsyst. 2018, 61, 11–20.
  31. Lundell, M.; Tang, J.; Hogan, T.; Nygard, K. An Agent-based Heterogeneous UAV Simulator Design. In Proceedings of the 5th WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases, Madrid, Spain, 15–17 February 2006; pp. 453–457.
  32. Firdaus, M.I.; Rau, J.Y. Comparisons of the Three-dimensional Model Reconstructed using MICMAC, PIX4D Mapper and Photoscan Pro. In Proceedings of the 38th Asian Conference on Remote Sensing-Space Applications: Touching Human Lives, ACRS, New Delhi, India, 23–27 October 2017; Available online: https://www.researchgate.net/publication/325216490_comparisons_of_the_three-dimensional_model_reconstructed_using_micmac_pix4d_mapper_and_photoscan_pro (accessed on 16 February 2022).
  33. Saifizi, M.; Syahirah, N.; Mustafa, W.A.; Rahim, H.A.; Nasrudin, M.W. Using Unmanned Aerial Vehicle in 3D Modeling of UniCITI Campus to Estimate Building Size. J. Phys. Conf. Ser. 2021, 1962, 012057.
  34. Gatziolis, D.; Lienard, J.F.; Vogs, A.; Strigul, N.S. 3D Tree Dimensionality Assessment Using Photogrammetry and Small Unmanned Aerial Vehicles. PLoS ONE 2015, 10, e0137765.
  35. Mairaj, A.; Baba, A.I.; Javaid, A.Y. Application Specific Drone Simulators: Recent Advances and Challenges. Simul. Model. Pract. Theory 2019, 94, 100–117.
  36. Kim, J.K.; Kim, J.S.; Jeon, K.M.; Lee, H.J. Development of Simulator for Shading Area Detecting in UAV Photogrammetry. J. Digit. Contents Soc. 2019, 20, 1917–1923.
  37. Kim, J.K.; Kim, J.S.; Jeon, K.M.; Lee, H.J. Development of Pre-flight Simulator to Improve the 3D Modeling Efficiency using UAV. J. Digit. Contents Soc. 2020, 21, 1705–1711.
  38. Phantom 4 Pro—Product Information–DJI. Available online: https://www.dji.com/phantom-4-pro/info#specs (accessed on 16 February 2022).
Figure 1. Development and validation process of this study.
Figure 2. Architecture and modules of the simulator.
Figure 3. Overall process of automatic path planning in the simulator.
Figure 4. (a) Isometric view of nadir photogrammetry path, (b) top view of oblique photogrammetry path, and (c) side view of oblique photogrammetry path.
Figure 5. Validation process of three functions and their indicators in the overall simulation process.
Figure 6. Size of line segment in the image.
Figure 7. Procedure to estimate image coincidence.
Figure 8. Shaded area relation between simulator and actual flight.
Figure 9. (a) Appearance and (b) boundary of the Geomexsoft office building. (c) Appearance and (d) boundary of the Research Institute for Gangwon building.
Figure 10. (a) Path planning in the simulator and (b) waypoints comparison near the Geomexsoft building for scenario 1. (c) Path planning in the simulator and (d) waypoints comparison near the Research Institute for Gangwon building for scenario 2.
Figure 11. GSD calculation using images and lines.
Figure 12. Overlap estimation among images.
Figure 13. Coincidence between two images.
Figure 14. Four parts of the area. (a) Common non-shaded area, (b) common shaded area, (c) shaded area not in actual flight but in simulator, and (d) shaded area not in simulator but in actual flight.
Table 1. Functions and details for the UAV photogrammetry simulator.
Layer control
  • Layer pan/tilt/zoom
Shape file control
  • Shape file import
2D/3D map synchronization
  • 2D/3D map synchronization
Map control
  • Map pan/tilt/zoom
Point cloud control
  • Point cloud import
Path generation
  • Path layer generation
  • Automatic altitude measuring
  • Parameter setting
  • 2D/3D path visualization
Flight simulation
  • Simulation according to generated path
  • Path export as a mission file or a waypoints file
Shaded area detection
  • Z-buffer ray generation
  • Visual analysis
Table 2. Functions and indicators for validation of the simulator.
Path planning
  • GSD accuracy
  • Overlap accuracy
Flight simulation
  • Image coincidence rate
Shaded area detection
  • Shaded area detection accuracy
Table 3. Comparison between real flight and simulator in the detection of shaded area.

| | Real flight: shaded area ($s_r$) | Real flight: pictured area ($s_{total} - s_r$) |
| Simulator: shaded area ($s_s$) | True positive ($s_c$) | False positive ($s_s - s_c$) |
| Simulator: pictured area ($s_{total} - s_s$) | False negative ($s_r - s_c$) | True negative ($s_{total} - s_s - s_r + s_c$) |
Table 4. Specifications of the digital camera used in the test, from [38].

Item | Specification
CMOS | 1″
Pixel size | 5472 × 3078
Focal length | 8.8 mm
ISO | 100–3200 (automatic); 100–12,800 (manual)
Shutter speed | 1/2000–8 s (mechanical); 1/8000–8 s (electronic)
Table 5. First test scenario for the validation of the simulator.

Item | Value
GSD | 2 mm
Forward overlap | 80%
Side overlap | 60%
Table 6. GSD accuracy of the simulator.

Item | Horizontal Direction | Vertical Direction | Arithmetic Mean of Both Directions
Designed value (mm) | 2 | 2 | 2
Estimated value (mm) | 2.1519 | 2.5500 | 2.3510
Error (mm) | 0.1519 | 0.5500 | 0.3510
Error rate | 0.0760 | 0.2750 | 0.1755
Table 7. Overlap accuracy of the simulator.

Item | Forward Overlap | Side Overlap
Designed overlap (%) | 80.00 | 60.00
Estimated overlap (%) | 78.90 | 64.01
Error (%) | −1.10 | 4.01
Table 8. Image coincidence rate of the simulator and real flight.

Mean | Standard Deviation | 95% Confidence Interval, Lower Limit | 95% Confidence Interval, Upper Limit
86.78% | 5.00% | 83.7% | 89.9%
Table 9. Size of each area (m²) of validation measured by the simulator and the real flight.

Scenario | Object Surface ($s_{total}$) | Common Shaded Area ($s_c$) | Shaded Area in the Simulator ($s_s$) | Shaded Area in the Actual Flight ($s_r$)
1 | 2328.13 | 51.36 | 90.96 | 137.04
2 | 9839.68 | 247.72 | 384.97 | 611.24
Table 10. Shaded area detection performance of the simulator.

Index | Scenario 1 | Scenario 2
Accuracy | 0.9462 | 0.9491
Precision | 0.5646 | 0.6435
Recall | 0.3748 | 0.4053
Specificity | 0.9819 | 0.9851
F1-score | 0.4505 | 0.4973
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
