Article

VR Multiscale Geovisualization Based on UAS Multitemporal Data: The Case of Geological Monuments

by Ermioni-Eirini Papadopoulou 1,*, Apostolos Papakonstantinou 2, Nikoletta-Anna Kapogianni 3, Nikolaos Zouros 1 and Nikolaos Soulakellis 1

1 Department of Geography, University of the Aegean, 81100 Mytilene, Greece
2 Department of Marine Sciences, University of the Aegean, 81100 Mytilene, Greece
3 Department of Informatics, University of Piraeus, 18534 Piraeus, Greece
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(17), 4259; https://doi.org/10.3390/rs14174259
Submission received: 4 July 2022 / Revised: 19 August 2022 / Accepted: 25 August 2022 / Published: 29 August 2022

Abstract: Technological progress in Virtual Reality (VR) and Unmanned Aerial Systems (UASs) offers great advantages in the field of cartography, and particularly in the geovisualization of spatial data. This paper investigates the correlation between UAS flight characteristics for data acquisition and the quality of the derived maps and 3D models of geological monuments for VR geovisualization at different scales and timeframes. In this study, we develop a methodology for mapping geoheritage monuments based on different cartographic scales. Each cartographic scale results in different orthophotomaps and 3D models. All orthophotomaps and 3D models provide an optimal geovisualization, combining UAS and VR technologies and thus contributing to the multitemporal 3D geovisualization of geological heritage on different cartographic scales. The study area selected was a fossiliferous site located in the Lesvos Geopark, a UNESCO Global Geopark. The study area contains a fossil site with various findings. The three distinct scales are based on the object depicted: (i) the fossiliferous site (1:120), (ii) the fossil root system (1:20), and (iii) individual fossils (≥1:10). The methodology followed in the present research consists of three main sections: (a) scale-variant UAS data acquisition, (b) data processing and results (2D–3D maps and models), and (c) 3D geovisualization and integration into VR. Each mapping scale determines the UAS data acquisition parameters (flight pattern, camera orientation and inclination, height of flight) and defines the resolution of the 3D models to be embedded in the VR environment. Due to the intense excavation of the study area, the location was spatiotemporally monitored at the cartographic scale of 1:120. For the continuous monitoring of the study area, four different UASs were used. Each of them was programmed to fly and acquire images with a constant ground sampling distance (GSD). The data were processed by image-based 3D modeling and computer vision algorithms, from which the 3D models and orthophotomaps were created and used in the VR environment. As a result, a VR application visualizing multitemporal data of geoheritage monuments across three cartographic scales was developed.

1. Introduction

Mapping techniques based on remote sensing and three-dimensional (3D) Earth modeling have achieved significant progress, both in terms of platforms and sensors and in the methods and software used [1].
In the last decade, UASs have received considerable attention as platforms equipped with recording sensors capable of automated missions for quick 2D and 3D data production [2,3,4,5]. The advantage of UASs is their ability to map and monitor various phenomena on the earth’s surface with high temporal and spatial resolution [6,7,8]. Moreover, UASs allow a quick, easy, and low-cost method of data acquisition in a number of critical situations where immediate access to 3D geo-information is crucial [7,9,10,11]. They can be used in high-risk situations and inaccessible areas to monitor spatiotemporal changes and phenomena. In cases where a large cartographic scale is demanded, UASs can complement or replace terrestrial acquisition [12]. The high-resolution aerial images acquired by UASs can be used not only for the generation of dense point clouds, but also for texture mapping of 3D data, orthophoto production, high-detail digital surface model creation, and 3D building modeling [13,14,15,16,17,18,19,20]. UASs decrease operational costs and reduce the risk of access in harsh environments while still keeping a high accuracy potential [12].
In the last decade, parallel to the advancements in UAS technology, the spatial resolution of satellite imagery has significantly improved. Nevertheless, satellite data are still not sufficient for mapping and monitoring very small (centimeter-level) changes. Equipped with various miniaturized high-precision sensors, UASs can provide high-resolution aerial images that, combined with Ground Control Points (GCPs) and Post Processing Kinematic (PPK) and Real Time Kinematic (RTK) methods, can be used to create accurate geo-information at a low cost [21,22]. Unmanned aerial systems, consisting of an aerial platform and a sensor, provide digital images with spatial and temporal resolutions capable of overcoming some of the limitations of spatial data acquisition using satellites and airplanes. The increase in the flight capabilities and agility of UASs, as well as in their endurance and the variety of onboard sensors and tools available, can be exploited for various monitoring tasks covering a plethora of spatiotemporal phenomena and environmental parameters [23,24,25]. Several recent publications have described methods and techniques that measure spatiotemporal changes using UASs [5,15,26,27]. UASs are a viable option for collecting remote sensing data for a wide range of practical applications, including scientific, agricultural, and environmental applications [28,29,30,31].
New technologies in remote sensing that emerged in the 21st century and the advent of UAS in data acquisition changed the mapping process and reshaped visualization products. UAS aerial data offer a unique opportunity to measure, analyze, quantify, map, and explore phenomena at high temporal frequencies and at very high ground resolution [32,33]. High-resolution and scale-variant UAS data contribute significantly to the cartographic and visualization process of spatiotemporal geo-information [5]. Thus, new visualizations can be implemented to support the identification and visualization of new patterns, potential relationships related to the spatiotemporal trends of various phenomena, and geo-information extraction.
Geovisualization, as a subcategory of scientific visualization in the field of computer technology, is the visualization of geospatial data, in which spatial patterns and relationships in complex data are interpreted using visual representations [34,35]. Interactive multimedia technology has contributed substantially to the development of geovisualization tools in cartography, with multimedia cartography being a new and efficient way of accessing and delivering geospatial information, both for professionals in the field and for the public, who now utilize maps on a daily basis [36]. This approach extends beyond the context of traditional cartography, requiring further investigation of how it can be utilized in the development of already available and continuously growing geospatial databases [37,38,39,40]. Despite technological advances and the variety of 3D visualization applications available, in which spatial data are conveyed across different scales from global to local levels, there is still unexplored ground both on a conceptual basis and in how such visualizations affect human perception [41].
In the era of digital information technologies, 3D modeling and computer graphics techniques are applied to the development of virtual models for computer simulation, artificial intelligence (AI), big data analytics, etc., and to various applications in virtual reality (VR) [42]. VR is a technology that provides an almost real and believable experience in a synthetic or virtual way. The goal of immersive VR is to completely immerse the user inside the computer-generated world, giving the impression that the user has “entered” the synthetic world. With the right level of immersion, VR can support a wide range of uses, including: (i) training and education [43], (ii) customer experience [44,45], (iii) entertainment [46], (iv) travel and tourism [47,48], and nowadays (v) geovisualization [26,49].
The number of contemporary visualization techniques applied to geographic data is rapidly increasing [50,51,52,53]. VR techniques challenge the means of geovisualization, as their potential applications in cartography and geoinformatics have not been thoroughly investigated [54,55,56,57]. Existing studies on VR applications present implementations of and approaches to the visualization of geographic data [58,59,60,61]. However, the utilization of 2D and 3D maps in VR has been assessed only to a limited extent [62,63,64,65]. Although photogrammetric 3D models derived from images acquired by UASs are employed in VR, they do not always conform to established cartographic principles. In the recent literature, geovisualizations in VR are carried out using 3D terrain models and 3D point clouds [66,67]. Terrain models and VR point clouds concern smaller cartographic scales, such as entire drainage basins or large geographical areas [68,69]. In existing studies, comparisons are made between different recording sensors, such as high-resolution cameras and LiDAR, but only for the same area and not for various scales [66]. Based on the research to date on geovisualization in VR environments, neither is the concept of cartographic scale considered, nor is a flight mode (flight height, camera angle, etc.) proposed for the collection of high-resolution images for the accurate and precise creation of 3D models of geographical areas. In addition, a way of visualizing spatiotemporal data and monitoring dynamic phenomena in virtual reality is not present in prior research. Thus, to cover the above gaps, this work focuses on proposing a way to collect data according to the cartographic scale in order to produce 3D models suitable for VR geovisualization. The major challenge is to apply cartographic principles and address scale issues when 2D and 3D maps and geospatial data are imported into a VR environment.
This paper studies the applicability of UAS data acquisition, flight characteristics, and the quality of the derived maps and 3D models of geoheritage areas in VR geovisualization. This research aims to develop an overarching methodology that utilizes cartographic scale for the geoheritage monuments’ multiscale 3D mapping. The objective is to devise and implement UAS flight characteristics (flying altitude, pattern, and camera angle) to generate high-resolution 2D maps and 3D models sufficient for VR geovisualization across the three cartographic scales. The proposed UAS flight characteristics can be applied in various timeframes. This highlights the advantages and practicability of UAS in monitoring dynamic activities such as the formation, geoconservation, and promotion of sites with high geological significance.
The main contribution of this work is the utilization of cartographic scale principles for the creation of efficient visualizations in VR. Cartographic principles are applied, and the spatial resolution of the images is correlated with the detail and precision of the 3D models produced from data collected via UASs. The flight altitude, path, and camera angle of the UAS affect both the geometry and the precision of the 3D models, as well as the quality of the texture. More specifically, a geometrically complex object needs lateral shots to be captured in detail in 3D, while a flat area can be captured by vertical shots alone. This, combined with the mapping scale and the eye’s visual acuity, determines the resolution of the 3D model needed to efficiently geovisualize the area or object studied. The above parameters are important for visualizations in VR, as the user of the application has direct contact with the 3D model and can observe it from all sides. This study focuses on the integration of UAS technology and VR techniques for an efficient multitemporal 3D geovisualization of geological monuments on various cartographic scales.

2. Materials and Methods

2.1. Study Area

The paper’s study area is part of the wider area of the Petrified Forest of Lesvos in the northeastern Aegean Sea in Greece. It consists of a rare, petrified forest ecosystem with concentrations of petrified trees covered by volcanic material and fossilized in place millions of years ago. Petrified tree trunks, branches, roots, and tree leaves are revealed under layers of volcanic ash [70,71].
By Presidential Decree (PD 443/1985), the Petrified Forest was declared a natural monument. Every small or large part of the fossilized trunks in the area is protected by law. The Petrified Forest area was a founding member of the European Geoparks Network in 2000 and joined the Global Geoparks Network, operating under the auspices of UNESCO, in 2004 [72,73].
The new road opening between Kalloni and Sigri brought new fossil sites to light. Several standing and fallen tree trunks were discovered alongside the road, especially in its western part. For this reason, the sites with a high concentration of fossils had to be optimally configured to preserve and protect the findings. Visitor parks have been established in areas with a high concentration of petrified trees. The study location is a representative example of these fossiliferous sites, containing petrified tree trunks that appear at different altitudes and an impressive root system (Figure 1). The area is located 3 km before the settlement of Sigri and faces west.
The Museum of Natural History of the Petrified Forest of Lesvos carried out procedures for protecting, conserving, and promoting the fossils, making the site accessible to the public. These management processes were completed at various time intervals, and the alterations in the area were evident. Thus, the methodology presented in this work was developed to spatiotemporally monitor the alterations and the progress of the works on the fossiliferous site.

2.2. Methodology

The following methodology was developed to integrate UAS technology with VR techniques for the most efficient multitemporal 3D geovisualization of geological heritage on different cartographic scales (Figure 2). A scale investigation of the geographical area of the site is first conducted. More specifically, the three geographical levels are: (i) the fossil site, (ii) the fossil root system, and (iii) individual fossils. From the determination of these geographical levels, three corresponding cartographic scales emerged: (a) 1:120, (b) 1:20, and (c) ≥1:10. The definition of three different cartographic scales determines the suitable resolution of the cartographic results. The ground sampling distance (GSD) of the very high-resolution images (VHRI) and the camera features of each UAS determined the flight altitude for each cartographic scale. The flight pattern and the camera orientation were then configured for each case. Along with the image acquisition, the ground control points (GCPs) were collected and used for georeferencing.
The raw data collected were then inspected for their suitability and reliability in a post-processing step. The VHRIs were georeferenced with the GCPs, and the structure-from-motion and multiview-stereo algorithms were then applied to create the 3D dense point cloud and the 3D model in Agisoft Metashape [74]. The digital surface model (DSM) and the orthomosaic of the study area were then created. The data were acquired on eight different dates, simultaneously with the maintenance and promotion works in the fossil site area. The image-based 3D modeling results were imported into the ArcGIS Pro software [75], where 2D and 3D maps were created. The 2D and 3D maps were then published online via ArcGIS Enterprise [76].
The next stage in the methodology examines the visualization of the 2D and 3D multitemporal and multiscale results in a virtual reality environment. Initially, the VR application scenario was structured. The scenario comprises two virtual gallery rooms: one for the spatiotemporal geovisualization of the results and a second for the multiscale geovisualization. Interactive features between the user and the virtual space were programmed in both rooms, along with features specific to each room. Some interactive features were linked to geographic information systems (GIS) through a specific software development kit (SDK) that allows the transfer of real-world coordinates into the virtual space. The application’s functionality was tried and tested using VR equipment, and the application was finally launched in an executable format (.exe) compatible with the Windows operating system.

2.3. UASs and GNSS

Data were acquired from the area using four different UASs, with different recording sensors and lens features (Table 1). More specifically, the first UAS used was the Phantom 4 Pro by DJI. The DJI Phantom 4 Pro is a mid-range quadcopter (1375 g in weight) with a three-axis stabilized camera. The Phantom 4 Pro camera, the FC6310, has a 1-inch, 20-megapixel CMOS sensor and a manually adjustable aperture from f/2.8 to f/11. It also has a focus range from 1 m to infinity. The Phantom 4 Pro uses a camera lens optimized for aerial imaging, with a 24 mm equivalent focal length and a 9 mm real focal length. The actual sensor size is 13.2 × 8.8 mm, and the images, with a 3:2 aspect ratio, have a resolution of 5472 × 3648 pixels. The second UAS was the Mavic 2 Pro, equipped with a Hasselblad camera. This camera has a 20-megapixel resolution and a 1-inch CMOS sensor. The sensor’s dimensions are 13.2 × 8.8 mm, and the resolution of the images in the 3:2 ratio is 5472 × 3648 pixels. The Hasselblad aperture range is f/2.8–f/11, and the lens is fixed with a true focal length of 11 mm. The body of this UAS weighs 907 g. In addition, it has obstacle-avoidance sensors and four propellers. The third UAS used for data collection was the Inspire 2, a quadcopter weighing 3400 g and belonging to the middle class of UASs. It has a gimbal with a three-axis stabilizer on which different recording sensors can be mounted. The recording sensor used for this task was the Zenmuse X5S camera. This camera has a 4/3 CMOS sensor and a resolution of 20.8 megapixels. The actual size of the sensor is 17.3 × 13 mm, and the resolution of the images captured is 5280 × 3956 pixels. Different lenses can be mounted on the X5S. For the purposes of this recording, two lenses were used: (a) a DJI MFT with a focal length of 15 mm and an aperture of f/1.7, and (b) an Olympus M.Zuiko with a focal length of 25 mm and an aperture of f/1.8. The fourth UAS used for data acquisition was the Matrice 300, equipped with a Zenmuse P1 camera. The camera specifications are: (i) 45-megapixel resolution, (ii) f/2.8–f/16 aperture, (iii) a full-frame recording sensor with an actual size of 35.9 × 24 mm, and (iv) an image resolution of 8192 × 5460 pixels. The Zenmuse P1 supports various lenses with different focal lengths. In this study, a 35 mm lens was used.
The Hiper SR receiver was used for the Global Navigation Satellite System (GNSS) measurements. This equipment consists of two parts, (i) the base and (ii) the rover, which communicate with each other. The Hiper SR receiver has 226 channels for universal tracking and receives signals from GPS and GLONASS. It can operate in four different modes: (a) Static/Fast Static, (b) Precision Static, (c) Real Time Kinematic (RTK), and (d) DGPS. For the present work, the RTK mode was chosen, whose accuracy reaches up to 10 mm horizontally and up to 15 mm vertically.

2.4. Virtual Reality (VR)

Virtual reality techniques and equipment were used for the 3D geovisualization of the results obtained after processing the data collected with the UASs. The VR headset used was the Valve Index. This package includes the headset, two controllers, and two base stations. The Valve Index headset uses stereo RGB displays, allowing immersion in the virtual world. The headset has two 1440 × 1600 LCD screens with full RGB per pixel and an ultra-low-persistence backlight (0.330 ms at 144 Hz). The refresh rate of the displays is 80/90/120/144 Hz. The headset has an ergonomic design, allowing adjustment of: (i) the distance between the convex lenses (interpupillary distance), within a 58–70 mm range, (ii) the distance of the lenses from the eyes (front–back), (iii) the head-strap circumference, and (iv) the position of the headphones, for a natural fit. The headset’s headphones use 37.5 mm balanced-mode drivers with a frequency response of 40 Hz–24 kHz.
The VR headset works in combination with the two base stations and the two controllers. Valve Index controllers are compatible with any headset that supports SteamVR [77] tracking. Each controller corresponds to either the left or the right hand and detects hand and finger position, movement, and pressure through 87 sensors to determine the user’s intention. Valve Index controllers allow reaching out and grasping an object directly instead of relying on buttons such as triggers. In addition, they have a wrist strap that allows the palm to open so that objects can be thrown. The strap is easy to secure and fits a variety of hand sizes. The main body of each controller is the grip input, with built-in force sensors set to detect a wide range of forces, from a gentle touch to a strong grip. This improves physical actions such as gripping and throwing objects and introduces new interactions such as squeezing and crushing. The ten buttons on each controller can be programmed or deactivated depending on the needs of the application.
The package includes two base stations. Valve Index base stations are equipped with sweeping lasers that scan the space 100 times per second to locate the photon sensors on the headset and controllers. The 2.0 base stations cover an area of 7 × 7 m, and their field of view is 160° × 115°. These base stations are also compatible with other VR headsets, such as the HTC VIVE. The equipment operates via SteamVR. SteamVR is a tool for experiencing virtual reality content and is compatible with the Valve Index. It allows the user to delimit the area in which the VR equipment is used and is fully compatible with game engine software such as Unity and Unreal Engine.

3. Data Acquisition and Processing

3.1. Flight Planning

The first stage of data acquisition was the scale investigation. A total of three geographical levels emerged, depending on the observation needs of the fossil site’s management board. The largest geographical extent covers the entire fossil-bearing site. The site’s configuration was carried out from 2018 to 2021, resulting in the need to monitor the progress of the works and capture the area on different dates. The second geographical level that emerged was the root system, and the third was the individual fossil, where conservation procedures on the root system and individual fossils were carried out. The two smaller geographical levels were not captured over time, as the findings are covered with special protective material during the fieldwork. The geographical extent of each case led to three different cartographic scales: (i) ≥1:10, (ii) 1:20, and (iii) 1:120. This defined the required GSD of the images acquired for the respective cartographic scales: (i) ≤0.1 cm, (ii) ≤0.20 cm, and (iii) ≤1.3 cm.
The GSD of the images and the camera’s features determine the flight altitude of a UAS. In this case, four different aircraft and four different recording sensors were utilized, and the flight altitude was calculated separately for each camera. Table 2 lists the flight dates, the UAS and recording sensor that collected the data, the cartographic scale, the GSD, and the corresponding flight altitude.
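The relation between map scale, required GSD, and flight altitude described above can be written out explicitly. The short sketch below is illustrative only and is not the authors' code: it assumes the common rule of thumb of roughly 0.1 mm at map scale for the required GSD, which reproduces the thresholds of this section to within a millimetre, and uses the standard pinhole relation between GSD, focal length, sensor width, and image width, with Phantom 4 Pro values from Table 1.

using System;

class FlightAltitudeSketch
{
    // GSD [m/px] = (sensorWidth [mm] * height [m]) / (focalLength [mm] * imageWidth [px]),
    // solved here for the flight height.
    static double AltitudeForGsd(double gsdMeters, double focalLengthMm,
                                 double sensorWidthMm, int imageWidthPx)
    {
        return gsdMeters * focalLengthMm * imageWidthPx / sensorWidthMm;
    }

    // Rule of thumb assumed here: required GSD of about 0.1 mm at map scale.
    static double RequiredGsdMeters(double scaleDenominator)
    {
        return 0.0001 * scaleDenominator;
    }

    static void Main()
    {
        // Phantom 4 Pro values: 9 mm real focal length, 13.2 mm sensor width, 5472 px image width.
        double gsd = 0.008; // 0.8 cm/px, as flown on 11 July 2018
        Console.WriteLine($"Required GSD for 1:120 ~ {RequiredGsdMeters(120) * 100:F1} cm/px");
        Console.WriteLine($"Altitude for 0.8 cm GSD ~ {AltitudeForGsd(gsd, 9.0, 13.2, 5472):F1} m"); // ~29.8 m
    }
}

For the 11 July 2018 flight, this relation gives roughly 30 m, which matches the flight altitude reported below.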
Data collection was performed on different dates between July 2018 and May 2022. In 2019, no flights were conducted, as no configuration work was carried out at the site. In 2020, all maintenance and promotion processes ceased due to the COVID-19 pandemic. The first flight was executed on 11 July 2018, with a Phantom 4 Pro. This flight covered a large geographical area, as its purpose was to define clear boundaries for the site (Figure 3a). The flight was executed at a 30 m flight altitude, and the GSD of the images was 0.8 cm. The GSD was less than 1.3 cm, so the data were suitable for a 1:120 cartographic scale. The overlap of the VHRIs was 80% front and 70% side. The flight lasted 20 min in total, and 600 images were collected. The second flight took place on 13 March 2021, with a Phantom 4 Pro, for a 1:120 cartographic scale (Figure 3b). By this date, the site limits had been set, so the mapping area was limited to the fossil-site level. The flight was performed at 40 m, the GSD of the images was 1.07 cm, and the overlap of the images was 80% front and 60% side. The flight lasted 8 min in total. The third flight was performed on 8 May 2021, with the Inspire 2 and the Zenmuse X5S camera with the Olympus 25 mm lens, and mapped the area at a scale of 1:120. The flight altitude was 50 m, and the spatial resolution of the images was 0.5 cm. The overlap of the images was 80% front and 70% side, and 130 images were collected. The total duration of this flight was 7 min. On the same date (8 May 2021), a flight captured the area at the fossilized root-system level, i.e., at a scale of 1:20. For 3D mapping at this scale, the aircraft followed a pattern perimetric to the point of interest (Figure 3c).
On 6 July 2021, the Inspire 2 with the X5S camera and the DJI MFT 15 mm lens acquired data in a grid pattern at a flight altitude of 40 m. The images had a GSD of 0.9 cm (<1.3 cm) and an 80% front and 60% side overlap. The next flight took place on 21 July 2021, with the Mavic 2 Pro, at a flight altitude of 40 m and a GSD of 0.16 cm (<1.3 cm). An identical flight with the same equipment was performed on 11 September 2021 and acquired data for mapping at a 1:120 cartographic scale. All flights were designed considering the area’s high relief, with the aircraft flying at a constant distance above the ground. In addition, the camera was set at a −90° angle, i.e., pointing vertically at the ground. On 11 September 2021, a very low flight was also conducted at 5 m with the Inspire 2 and the Zenmuse X5S (25 mm) camera. This flight aimed to produce an orthophotomap and a high-resolution 3D photorealistic model at a cartographic scale of 1:10. The pattern followed was perimetric to the site, with the camera in an oblique position (Figure 3d). Additional images were acquired perpendicular to the fossil at the same flight altitude. Work on the fossil-bearing site was also recorded on 13 October 2021. That flight was conducted with the Inspire 2 (DJI MFT, 15 mm) at a 40 m flight altitude. The 160 images acquired had an 80% front and 70% side overlap, and their GSD was 0.9 cm (<1.3 cm). All the flights mentioned above were planned and executed through the Litchi Mission Hub software [78].
The most recent recording of the site took place on 15 May 2022, using the Matrice 300 and the Zenmuse P1 (35 mm) camera. The Matrice 300 flew at an altitude of 50 m, and the images had a GSD of 0.5 cm (<1.3 cm), suitable for mapping the area at a 1:120 cartographic scale. The flight was designed with the DJI Pilot application [79]. The front overlap of the images was 80% and the side overlap was 60%. A total of 170 photos were collected, and the flight’s duration was 13 min. The study area presents several difficulties for data acquisition due to the intense elevation differences across the fossil-bearing site. In addition, its northwest orientation creates intense shadows on the slopes and the steep sides inside the trenches in the morning hours. Another difficulty encountered during data acquisition was the dust raised into the atmosphere by the ongoing work at the site.

3.2. Georeferencing

Georeferencing of the VHRIs was performed using ground control points (GCPs). The points were measured with the Hiper SR receiver and the RTK method [80]. The coordinate system used to measure the GCPs and georeference the VHRIs was the Greek Grid (EPSG:2100). In more detail, 20 GCPs were measured in the surrounding area of the fossil site. The points were collected on 11 July 2018. Figure 4 shows the measured points in the study area; the points used as control points are displayed in red and the checkpoints in yellow. The checkpoints are concentrated on the southeast side of the area because defining the site’s boundaries was the main target on the first recording date; consequently, points that demarcated the wider area were acquired. The monitoring area was later limited to the north and west sections where the fossils appear. Therefore, the VHRIs were georeferenced with the four red GCPs, located around the perimeter of the fossil site and at different altitudes. The same control points were utilized on all recording dates, as they were located at established positions in the area. The points measured inside the site served as checkpoints. However, these points could not be utilized on all dates, as the maintenance and promotion works in the area were intense. More specifically, protective walls, paths, and individual positions were created in the area, resulting in the disappearance or loss of several checkpoints.
Based on the 1:120 cartographic scale, at which the entire fossiliferous site was mapped, the acceptable horizontal and vertical accuracy error is 4 cm [80,81]. Table 3 shows the GCP errors along the x, y, and z axes and the total RMS for each recording date at the 1:120 cartographic scale. More specifically, the total RMS of the data georeferencing was: (i) 1.74 cm on 11 July 2018, (ii) 3.12 cm on 13 March 2021, (iii) 2.73 cm on 8 May 2021, (iv) 3.54 cm on 6 July 2021, (v) 3.53 cm on 21 July 2021, (vi) 3.38 cm on 11 September 2021, and (vii) 4.06 cm on 13 October 2021.
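As a point of reference for reading Table 3, the sketch below shows one common convention for combining per-axis GCP residuals into a single total RMS value; the exact formula used by the processing software is not stated in the paper, so this is an assumption for illustration only, and the residuals in the example are hypothetical.

using System;

static class GcpRms
{
    // Total RMS over n ground control points, combining x, y, and z residuals (in metres).
    static double TotalRms(double[] dx, double[] dy, double[] dz)
    {
        double sum = 0.0;
        for (int i = 0; i < dx.Length; i++)
            sum += dx[i] * dx[i] + dy[i] * dy[i] + dz[i] * dz[i];
        return Math.Sqrt(sum / dx.Length);
    }

    static void Main()
    {
        // Hypothetical residuals for four GCPs (not the study's actual values).
        double[] dx = { 0.010, -0.008, 0.012, -0.005 };
        double[] dy = { -0.007, 0.011, -0.009, 0.006 };
        double[] dz = { 0.015, -0.012, 0.010, -0.014 };
        Console.WriteLine($"Total RMS ~ {TotalRms(dx, dy, dz) * 100:F2} cm");
    }
}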
On the last recording date, 15 May 2022, the equipment used was the Matrice 300, a UAS with a built-in RTK receiver. In combination with a DJI D-RTK base station, it can acquire primary data with very high accuracy. The D-RTK was placed at a point with known coordinates, creating a link between the aircraft and the reference (D-RTK base), which helped achieve an error of 4.01 cm. The data coordinates were then transformed from the World Geodetic System 1984 (WGS 84) to the Greek Grid (EPSG:2100) projection system. The horizontal and vertical accuracy of the data acquired with the Matrice 300 was checked against GCPs placed in the area on previous dates.

3.3. Image-Based 3D Modeling Process and Results

According to the methodology, the next stage was data processing to create a 3D point cloud, 3D models, and very high-resolution orthomosaics. The data were processed with photogrammetric and image-based 3D modeling methods to produce the cartographic results. The steps followed for image-based 3D modeling processing were the same for all datasets at each recording date.
Initially, quality control of the VHRIs was performed, first visually by an expert photo interpreter and then using the Image Quality Index (IQI) algorithm [82]. Images that were blurred, shaken, overexposed, or included parts of the horizon were excluded through the visual controls. VHRIs with IQI values outside the range of 0.5–1 were then excluded from further processing. The VHRIs suitable for photogrammetric processing were imported into the Agisoft Metashape software [74], where image alignment was applied. The alignment process implements structure from motion (SfM) [83,84], which relies on two algorithms, the scale-invariant feature transform (SIFT) [85,86] and random sample consensus (RANSAC) [87,88]. Applying these to the VHRIs results in a sparse point cloud. A denser point cloud is then generated using the multiview stereo (MVS) algorithm [83,89]. The resulting 3D dense point cloud is the basis for creating a 3D mesh. Specifically, through spatial interpolation, the points are connected into a triangulated irregular network (TIN), forming a single 3D mesh. The 3D mesh is enriched with a photorealistic texture, and a textured 3D model emerges. This is followed first by the creation of the digital surface model (DSM), which describes the elevation of the area, and finally by the orthorectification of the images’ pixels to create the orthomosaic. This processing was applied to the data acquired on the eight recording dates for each of the three different scales. The image-based 3D modeling results used to develop the VR application were the 3D models and orthophotomaps presented in Table 4.
More specifically, the 3D model and the orthophotomap produced for 11 July 2018 included two standing petrified tree trunks. The part of the area limited by the fence was parallel to the road, and the location was not yet accessible to the public. By the next recording date, 13 March 2021, increased activity due to the works had resulted in changes: lying petrified tree trunks had been transported, the fossils had been covered with a special protective material, and the first demarcation fences for each site had been installed. On 8 May 2021, the margins of the fossiliferous site were delineated by the wall raised around it. At the beginning of July of the same year, works on the fossilized root system as well as on various standing tree trunks of the study area were recorded. By 21 July 2021, the site had assumed a distinctive form, as the viewing level of the petrified trunks had become comprehensible. The first paths had been built, numbered signs had been placed on each find, and trunks had been transported to suitable display positions. In addition, the fence had been considerably extended towards the side of the road. On 11 September 2021, the construction of a staircase in the northwestern part of the site allowed access to its highest level. In addition, the equipment and tools had been removed from the area, the vegetation had been removed from the paths and the site, and the construction of the stone walls had been completed. On the last mapping date of the fossil site, 13 October 2021, wooden fences and stairs appeared around the perimeter of the findings. By 13 October 2021, all works for promoting and maintaining the fossil site had been completed, and the site was fully configured for public visits. A flight was performed in the spring of 2022 to capture the site’s state a few months after the completion of the works.
At this point, the site was accessible to the public, as the protective material placed during the winter months had been removed from the fossils. Low vegetation was present, as the images used in the orthophoto map were acquired during spring on 15 May 2022.
The two models and two orthophotomaps created for the larger cartographic scales were: (i) at the root-system level (1:20) and (ii) at the fossil level (≥1:10). The 3D model created at the 1:20 cartographic scale depicts the condition of the fossil on 8 May 2021, immediately after the removal of the protective material. This model allows the inspection of the root system and supports its maintenance process. In addition, it functions as a record for monitoring over time. The orthophotomap shows the root’s structure and branches as well as the parts at risk of erosion. The 3D model of the individual fossil visualizes the find in very high resolution and detail. Due to the high spatial resolution of the VHRIs, the texture and colors of the fossil are rendered with great photorealism, and parts of the bark and its grooves can be observed. The lying petrified trunk selected for mapping at a scale of 1:10 has the unique characteristic of having been transferred to the fossiliferous site under study after being found at a nearby location. Since it can be transported, its high-resolution mapping was important for its maintenance and conservation.

3.4. 3D Point Cloud and 3D Models Evaluation

Table 5 presents data and information regarding the 3D models created for the fossil site, the root system, and the fossil trunk. In particular, the 3D point cloud and 3D model created on 11 July 2018 capture the wider region, and thus a larger geographical area. For this reason, this 3D model consists of more surfaces than the rest of the models at the same cartographic scale (1:120). The only 3D model showing an even higher number of points and surfaces is the one created from the images collected on 15 May 2022. This is because the 2022 flight was performed with the Matrice 300 and the Zenmuse P1 (35 mm) camera. The Zenmuse P1 camera has a resolution of 45 megapixels, while the rest of the cameras used in this study have resolutions of up to 20.8 megapixels. The higher resolution of the camera increases the number of points in the 3D dense point cloud, which in turn determines the number of mesh surfaces. According to the table, the 3D point clouds and models created for the cartographic scales of 1:20 and ≥1:10 also gather a high number of surfaces and points. Compared to the point clouds and 3D models for the 1:120 cartographic scale, they have fewer points and surfaces; however, relative to the area depicted in each case, the point density and the detail of the surfaces are significantly higher.
To evaluate the 3D point clouds and 3D models, a comparison was made between the point clouds produced from the images collected by the UASs for the 3D geovisualizations at the 1:120, 1:20, and ≥1:10 scales. In essence, the 3D geovisualizations created for the three cartographic scales were compared. The comparisons were made between the point clouds, as they form the basis for the subsequent creation of the 3D models. The greater the number of points in the cloud, the greater the number of triangles in the triangulated irregular network (TIN) that makes up the mesh. A larger number of surfaces describes the geometry of the mapped area (root system and fossil) in more detail. Therefore, for 3D mapping at large cartographic scales such as 1:20 and ≥1:10, a large number of points in the cloud is required to produce a high-resolution 3D mesh. The comparison of the point clouds was performed in the CloudCompare software (version 2.11, 2020), and the methods used were cloud-to-cloud (C2C) distance and surface density.
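The two measures can be stated concretely. The sketch below is only an illustration of their definitions, not the CloudCompare implementation used in the study: C2C distance assigns to every point of the compared cloud the distance to its nearest neighbour in the reference cloud, and surface density counts the points falling in each planimetric grid cell (2 × 2 cm in this work). The brute-force nearest-neighbour search shown here is deliberately naive; production tools rely on octrees or k-d trees.

using System;
using System.Collections.Generic;

static class CloudMetrics
{
    // C2C distance: for each point of 'compared', the Euclidean distance to its
    // nearest neighbour in 'reference' (naive O(n*m) search, for illustration only).
    public static double[] CloudToCloud((double x, double y, double z)[] compared,
                                        (double x, double y, double z)[] reference)
    {
        var distances = new double[compared.Length];
        for (int i = 0; i < compared.Length; i++)
        {
            double best = double.MaxValue;
            foreach (var r in reference)
            {
                double dx = compared[i].x - r.x;
                double dy = compared[i].y - r.y;
                double dz = compared[i].z - r.z;
                best = Math.Min(best, dx * dx + dy * dy + dz * dz);
            }
            distances[i] = Math.Sqrt(best);
        }
        return distances;
    }

    // Surface density: number of points per grid cell on the XY plane,
    // e.g. cellSize = 0.02 m for the 2 x 2 cm cells used in the comparison.
    public static Dictionary<(long, long), int> SurfaceDensity(
        (double x, double y, double z)[] cloud, double cellSize)
    {
        var counts = new Dictionary<(long, long), int>();
        foreach (var p in cloud)
        {
            var cell = ((long)Math.Floor(p.x / cellSize), (long)Math.Floor(p.y / cellSize));
            counts[cell] = counts.TryGetValue(cell, out var c) ? c + 1 : 1;
        }
        return counts;
    }
}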
Initially, the densest point cloud created from the data collected for the 1:120 cartographic scale (15 May 2022) was selected. From this point cloud, the parts corresponding to the petrified root system and trunk were isolated. A comparison was then made between the clouds of (i) RS-1: root system (1:20) and RS-2: root system (1:120), and (ii) PTT-1: petrified tree trunk/fossil (≥1:10) and PTT-2: petrified tree trunk/fossil (1:120), using the C2C distance and surface density methods.
Figure 5 shows the results of the comparison between the point cloud of the root system created from the data of 8 May 2021 (Figure 5d) and the point cloud created from the images collected on 15 May 2022 (Figure 5e). Figure 5a presents the results of the C2C method. The densest point cloud, namely the RS-1 cloud, was defined as the reference cloud, and the algorithm was configured to locate points between the clouds at distances of up to 2 cm. This distance was chosen because the desired mapping scale is 1:20. The biggest differences between RS-1 and RS-2 are observed in the standing part of the petrified tree trunk, while in the flat part of the root system the distances between the two clouds are smaller; this is due to the way in which the images for 3D mapping were collected. The images used to create RS-1 were oblique, achieving a more detailed 3D depiction of the entire petrified root system, including the standing part. In contrast, the images used for the creation of RS-2 were vertical to the ground, with the consequence that the complex geometric features of the petrified root system were not fully captured. In Figure 5a, it can be observed that in the areas depicted in red, the distance between the point clouds is 2 cm or more. It can therefore be concluded that the compared cloud (RS-2) has fewer points in the red regions than the reference cloud RS-1. This observation is also verified by the results of the surface density computed separately for each point cloud (RS-1 and RS-2). Figure 5b presents the surface density of RS-1. The cell size used for the surface density was 2 × 2 cm. The areas in bright orange indicate a higher concentration of points. The largest part of the point cloud of the root system shows a high concentration of points, from 54,000 to 72,000 per cell. In addition, there are parts of RS-1 containing up to 142,178 points per 4 cm². Figure 5c depicts the surface density of RS-2 per 2 × 2 cm cell. The cloud appears sparser, as the geometry of the root system is not clearly discernible. The larger part of RS-2 shows a point density of 800 to 3200 points per 4 cm².
Equivalent comparisons were made for the point clouds of the individual petrified tree trunk, PTT-1: petrified tree trunk/fossil (≥1:10) and PTT-2: petrified tree trunk/fossil (1:120). Figure 6a shows the results of the C2C distance algorithm, which was applied for distances under 2 cm. The densest point cloud, i.e., PTT-1, was set as the reference cloud. The red areas appearing on the lower part and the side parts of the trunk indicate a low number of points in the PTT-2 cloud. This occurs because the images of 15 May 2022, used to create PTT-2, were captured vertical to the ground, excluding the side parts of the fossil. In contrast, the images for mapping the trunk at a cartographic scale of 1:10 were both vertical and oblique, and were collected from a much lower flight altitude (5 m). This significantly affects the surface density. The density of the PTT-1 points is displayed in Figure 6b, where the largest part of the fossil shows 2500 to 4500 points per 4 cm², whereas the PTT-2 surface density shows 2,000,000 to 3,200,000 points per 4 cm². Finally, the total number of points of PTT-1 (Figure 6d) is considerably larger (11,342,955) than that of PTT-2 (14,645). The number of points significantly affects the solidity of the 3D models. In the 3D mapping of complex geometric features such as a petrified tree trunk, a high number of points is required to achieve an accurate 3D representation. The high resolution of the 3D model in this case is important for the fossil’s conservation, protection, and promotion.

4. VR Geovisualization

4.1. VR Scenario

The design of VR applications is based on a scenario. The scenario is formulated to thoroughly cover the information that needs to be conveyed through the virtual navigation [90,91]. The VR geovisualization of the spatiotemporal monitoring of the maintenance works and the promotion of the fossiliferous site studied in the present work was based on the scenario presented in Figure 7. The application developed supports a single user, but more observers can be added through suitable programming. The scenario had to meet the need for the geovisualization of two different but interrelated thematic units: (a) multitemporal and (b) multiscale. The main idea was the creation of two rooms, one for each thematic unit: one for the spatiotemporal monitoring of the site at a cartographic scale of 1:120, and another for the 3D geovisualization at various scales.
Firstly, a map gallery room was designed. Panels were placed around the room’s perimeter showing the orthophotomaps created at the 1:120 scale for all recording dates. The maps were arranged in chronological order, from the map created on the oldest recording date (11 July 2018) to the most recent (15 May 2022). The user can interact with the orthophotomaps and move to a different room through the panels. Each of these rooms concerns the corresponding date and the corresponding 3D depiction of the fossil site. The user observes the fossiliferous site from the flight altitude at which the UAS flew for the needs of the 1:120 cartographic scale. The multitemporal room also contains a map of all the geosites of Lesvos Island, highlighting the fossil site located in the western part of the island. In the same room, a portal allows transfers from and to the room with the multiscale geovisualizations. The second room was designed with the same pattern. The multiscale room was smaller, with three panels containing the maps corresponding to the three cartographic scales: (i) 1:120, (ii) 1:20, and (iii) ≥1:10. The maps were arranged from the smallest to the largest cartographic scale. The user can interact with the maps by changing their format from a 2D visualization of the area to its respective 3D visualization. A portal in this room allows the user to be transferred back to the multitemporal room.
The virtual tour begins at the center of the multitemporal room, where the orthophotomaps of the study area are arranged according to the time of data collection. The user can move freely in the space and observe any panel, although, in practice, they are guided by the chronological order of the maps. The first image is the map of Lesvos, for the user’s orientation regarding the geographical location of the fossil site. The timeline moves clockwise and ends with the most recent date. The portal at the end of the timeline contains an indication for moving to the multiscale room. In the second room, the user can also move independently inside the space. However, the implicit guidance from the smallest to the largest cartographic scale makes the transition from the general to the particular comprehensible.

4.2. VR Processing and Programming

The 2D and 3D cartographic derivatives of the image-based 3D modeling process were used to create the VR application. The 3D models and orthomosaics created in the Agisoft Metashape software [74] were exported in 3D object format (.obj) and raster format (.tiff), respectively. The orthomosaics were imported into the ArcGIS Pro software (version 3.0.0) for cartographic processing and composition. The cartographic composition process resulted in 10 orthophotomaps, which were then used to develop the virtual reality application. The 3D models were utilized in two ways to develop the application: the first was through ArcGIS Pro [75], and the second was as original 3D models (meshes). Regarding ArcGIS Pro [75], the 3D mesh, materials, and JPEG texture of each 3D model were imported into the software and published on ArcGIS Online. Through ArcGIS Online, it is possible to share information on the web. Using a specially designed SDK, the 3D geospatial information on an ArcGIS server shared through ArcGIS Online can be synchronized with a game engine. In the present work, the synchronization was performed with the Unity 3D engine [92], specifically version 2021.2.5f1. The SDK was used to create eight different rooms, each corresponding to one of the eight recording dates at the fossil-bearing site. As shown in Figure 8, the geographical location of the study area and the observer’s position were defined with coordinates. In addition, the height from which the user observes the area at the 1:120 scale was defined, as well as the camera’s roll, pitch, and yaw. The relief was then determined, and the base map was selected. Satellite mode was chosen as the base map, as the models are visualized in a photorealistic way. The geographical data, i.e., the 3D models that retain their true coordinates, are imported into Unity 3D in layer form. Each layer is, in practice, a URL created through ArcGIS Online and corresponds to a 3D model of the area.
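A minimal sketch of how the observer parameters listed above could be grouped and applied on the Unity side is given below. This is an assumption for illustration, not the authors' implementation: the translation of the geographic coordinates into engine space is left to the ArcGIS Maps SDK layer, and only the camera orientation is applied directly.

using UnityEngine;

// Hypothetical container for the per-room observer parameters described above.
public class ObserverPose : MonoBehaviour
{
    public double longitude;          // WGS 84, decimal degrees
    public double latitude;
    public float observationHeight;   // metres above the site, fixed for the 1:120 view
    public float roll;                // degrees
    public float pitch;
    public float yaw;

    private void Start()
    {
        // Positioning in geographic space is assumed to be handled by the ArcGIS layer;
        // here only the viewing orientation is applied to the main camera.
        Camera.main.transform.rotation = Quaternion.Euler(pitch, yaw, roll);
    }
}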
The virtual reality (VR) graphical interface was developed with the Unity 3D engine, with the support of the SteamVR platform for experiencing VR content on a Valve Index headset. Two base stations were integrated during the installation of the Valve Index equipment. The base stations delimit the area in which the user can move autonomously; the area’s dimensions were 3 × 3 m. Initially, two virtual scenes were designed to include the 2D and 3D maps and the 2D and 3D models that form the virtual rooms. The 3D orthophotomaps and models were positioned in the virtual rooms based on the scenario. Following the placement of all the objects in the virtual space, the stage lighting had to be adjusted. The scenes’ lighting was generated with Unity’s Auto Generate lighting option for a homogeneous distribution of light in the rooms. Simulation of the user in the virtual world is a key element in the design of the VR application; thus, an avatar that imitates the player’s movements and actions was placed in the virtual space. The virtual room extended over an area of 10 × 10 m, where teleportation areas and teleportation points allowed the player to navigate the entirety of the area via the controllers.
Teleport areas and teleport points were placed in both the multitemporal and the multiscale scenes. A perimetric collider gave the avatar’s virtual body a human form, so that the user moves around, rather than through, other objects in the scene. Additionally, the maps trigger new actions when touched by the user. The first interaction programmed was the alternation between the two scenes. This interaction was implemented with a script written in C#, which adds further properties to each panel and connects it with the corresponding scene. This property was also utilized in a portal to allow transfer from the multitemporal to the multiscale room and vice versa. The multiscale room offers an additional action at the player’s disposal: a laser pointer implements the transition from the 2D orthophotomaps on the panels to the corresponding 3D models. This is achieved through the laser pointer on the controllers, where a C# script is activated by the trigger button.
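The authors' scripts are not published, so the following is only a minimal sketch of the kind of panel interaction described above: a panel component that loads the scene associated with it when the player's hand collider touches it. The "Hand" tag and the per-panel scene name are assumptions introduced for the example.

using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical panel behaviour: touching the panel with a collider tagged "Hand"
// loads the scene assigned to that panel (e.g. the room for one recording date).
public class PanelSceneSwitch : MonoBehaviour
{
    [SerializeField] private string sceneName;   // e.g. "Room_2021_10_13" (assumed name)

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand"))
        {
            SceneManager.LoadScene(sceneName);
        }
    }
}

An equivalent component on the portals would switch between the multitemporal and multiscale scenes in the same way.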

5. Results—VR 3D Mapping

The result of this study is a VR application for the spatiotemporal monitoring of the maintenance and promotional works on a fossil site of the Lesvos Geopark at three different cartographic scales. The final application was launched in .exe format for user testing. Figure 7 shows the application in its entirety and the way it operates.
In more detail, Figure 7b shows the first room, where the panels with the orthophotomaps at the 1:120 cartographic scale are displayed for all recording dates. The player starts at the center of the multitemporal room viewing the Lesvos Island map and is allowed to move freely within a limited space of 3 × 3 m. The first virtual room is larger than 3 × 3 m, so a teleport area was placed at its center, along with individual teleport points. The teleport points were placed in front of the fossil site’s orthophotomaps so that users can focus on the orthophotomaps without distractions from the surrounding space. In addition, the virtual player’s body and head positions were adjusted to face the panel at a distance that allows the entire map to cover their field of view. The maps were rendered at A0 paper size to maintain the appropriate spatial resolution (1.3 cm) for the 1:120 cartographic scale. Thus, the user can observe the changes across the different recording dates. Instructions on the panels provide information on the ways the user can interact with them. By touching the image of the map, the user moves to a different virtual space. This space is a 3D geographic area created with the ArcGIS Maps SDK.
A total of eight rooms were created using the ArcGIS Maps SDK. There are certain limitations concerning these eight rooms, which correspond to the geovisualization of the different timeframes. The ArcGIS Maps SDK operates with geographical coordinates in the WGS 84 reference system, which leads to difficulties in navigation, as the user must be placed in a position with specific azimuths and at an absolute altitude. This difficulty was addressed by using the UAS coordinates and the ω, φ, κ angles of the camera, which served as the basis for placing the observer in the 3D virtual geographic space. In the rooms with different timeframes, the observer monitors the study area with no option to change the height, keeping the observation scale constant at 1:120. To exit the rooms where the ArcGIS Maps SDK is used, the B button on both controllers was programmed to lead the user back to the multitemporal room. A portal activated by touch allows the transition from the multitemporal to the multiscale room. The multiscale room is shown in Figure 9a. This room contains three panels, making it smaller than the multitemporal room. The three orthophotomap scales on the panels are: (i) 1:120, (ii) 1:20, and (iii) ≥1:10. The user can move freely by walking in the room or by using the teleport points in front of the maps. Interaction with the orthophotomaps in this room is executed with the use of pointers. A pointer is placed on the left controller and activated through the trigger on the lower part of the handle. By aiming at a panel and pressing the trigger, the orthophotomap is replaced by the corresponding 3D model (Figure 9c,d). The models selected for the three 3D mapping scales were: (i) the entire fossil-bearing site (13 October 2021), (ii) the root system, and (iii) a single petrified tree trunk. Colliders were defined on each panel and model so that they cannot be penetrated by mistake and to enable the switching interaction. Figure 9b,c show the user’s interaction with the 3D models. The user is able to approach the 3D models, explore and observe them from all sides, and obtain important information about the area and the findings, as the resolution of the 3D models and textures is very high, accurate, and detailed. The addition of a portal similar to the one in the multitemporal room enables the transfer between the two rooms.

6. Discussion

Every stage of the presented methodology gives rise to important observations. In the data acquisition stage, the topography of the study area affects the design of the flight plan. This observation applies to all UAS types utilized to acquire VHRIs. In addition, the high-relief areas significantly affect the primary data, as the shadows are intense. To address this issue, the flights were performed at different times of day, depending on the season of the year. It is also worth mentioning that the use of different UASs is feasible for spatiotemporal monitoring at a constant mapping scale, provided the corresponding flight altitude is calculated for each recording sensor.
The results of the image-based 3D modeling process highlight the significance of the GSD of the VHRIs for the reconstruction of the orthophotomaps and 3D models. More specifically, data acquired at a flight altitude of 40–60 m produce orthophotomaps and 3D models at a scale of up to 1:120. When the orthophotomaps and 3D models of the fossil site are imported into a VR environment, they prove inadequate for the accurate and precise depiction of the root system and the individual fossil under close observation. For this reason, it is crucial for the flight characteristics of the UAS to be adapted to the mapping scale and to the geometry of the root system and the fossil. The root system was 3D mapped at a flight height of 15 m, providing an orthophotomap at a cartographic scale of 1:20. This scale allows the user to observe the root system in its entirety and to distinguish the root geometry without losing detail. This provides an enhanced understanding of the 3D representation of the fossilized root system and aids the management board in assessing the finding’s state of deterioration in order to maintain it accordingly. For the high-resolution 3D mapping of the individual fossil and for its geovisualization in VR, the VHRIs must be acquired at a very low flight altitude (≤5 m). The 1:10 mapping scale in the virtual space thoroughly conveys the fossil’s geometry and texture. The geovisualization of the petrified tree trunk is thus both impressive and helpful, as every detail can be observed.
In the VR geovisualization stage, the integration of game engines, GIS, and UAS data is performed. This process requires high computational power and expertise in the corresponding software and hardware. The VR application was designed to allow the user to monitor the maintenance and development processes of the fossilite ferrous site over various timeframes and at different cartographic scales. This required importing 15 photogrammetric models into the same project. The difficulties encountered were related to managing a large number of 3D models and maintaining their photorealistic textures. Therefore, the area rendered in 3D was carefully limited by cropping each model to the area of interest and removing the faces depicting the surrounding area, as sketched below. This procedure was applied to all 3D models used in the VR application. For the lighting of the scenes, natural directional lighting was created. The lights' direction was set at a 90° angle, perpendicular to the rooms, to eliminate shadows that affect the feeling of the space and the texture and colors of the 3D models. In addition, the rooms created using the ArcGIS Maps SDK offered relatively limited interaction, as this plugin was only recently released and the possibilities of connecting GIS to VR have not yet been thoroughly explored.
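The cropping step can be illustrated with a minimal sketch of our own, using plain NumPy arrays rather than the photogrammetric software actually employed: faces of a mesh whose vertices fall outside a 2D area-of-interest rectangle are discarded and the vertex array is compacted, which reduces the face count while keeping the monument itself intact.

# Sketch only: crop a mesh (vertices, faces) to a 2D area of interest.
import numpy as np

def crop_mesh_to_aoi(vertices, faces, xmin, ymin, xmax, ymax):
    """Keep only faces whose three vertices lie inside the area-of-interest rectangle."""
    v = np.asarray(vertices, dtype=float)   # (N, 3) x, y, z coordinates
    f = np.asarray(faces, dtype=int)        # (M, 3) vertex indices
    inside = (
        (v[:, 0] >= xmin) & (v[:, 0] <= xmax) &
        (v[:, 1] >= ymin) & (v[:, 1] <= ymax)
    )
    keep = inside[f].all(axis=1)            # a face is kept only if all its vertices are inside
    kept_faces = f[keep]
    used = np.unique(kept_faces)            # re-index vertices to a compact array
    remap = np.full(len(v), -1, dtype=int)
    remap[used] = np.arange(len(used))
    return v[used], remap[kept_faces]

# Dummy example: two triangles, one inside the AOI and one partly outside it
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (10, 10, 0)]
tris = [(0, 1, 2), (1, 2, 3)]
new_v, new_f = crop_mesh_to_aoi(verts, tris, 0, 0, 2, 2)
print(new_v.shape, new_f)   # -> (3, 3) [[0 1 2]]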
Regarding the use of the developed VR application, some conclusions were drawn in terms of its functionality. It was initially observed that users who were not accustomed to VR equipment encountered difficulties in navigation; for example, users who are unfamiliar with VR hesitate in their movements in the virtual space. Furthermore, the movements and interactions simulated through the controllers were initially unintuitive for novice users. For this reason, it is recommended that users be instructed in the operation of the VR equipment (headset and controllers) prior to the virtual tour. Practicing with the equipment does not require much time: it takes users only a few minutes to familiarize themselves with the headset and the virtual space and then to learn how the actions and buttons operate in this application. The total training time ranges from 10 to 20 min, depending on the user's previous experience with VR applications.

7. Conclusions

The main conclusion that can be drawn from the proposed methodology and the derivatives of the present study is that the cartographic scale and the data acquisition with UASs are crucial for the efficient visualization of geospatial information in VR. The cartographic derivatives of UAS data can be successfully geovisualized at the proper scale in a VR environment. Regarding the multitemporal monitoring of the maintenance and formation of the fossilite ferrous site, VR geovisualization has many advantages, such as: (i) the simultaneous observation of the fossil site at different times, (ii) the full utilization of the high resolution of the cartographic results, and (iii) the interactive conveyance of the derived information.
The broad contribution of the present research is to show that the flight characteristics (flight altitude, flight path, and camera angle) used for acquiring VHRIs strongly influence the quality of the derived 2D maps and 3D models. Three mapping scales were required for the high-resolution 2D and 3D mapping and the VR geovisualization of the fossilite ferrous site. The number of scales is associated with the levels at which the spatiotemporal monitoring of the site's formation process was performed: (i) the fossil site (1:120), (ii) the fossil root system (1:20), and (iii) individual fossils (≥1:10). Therefore, for their accurate and precise 3D mapping and their efficient (optimal) 3D VR geovisualization, the observance of cartographic principles and scale issues is of key importance, and the method of UAS data collection and the spatial resolution of the VHRIs are determined by the cartographic scale. Overall, our results demonstrate a strong cartographic scale effect in the development of VR geovisualization and confirm that the use of 2D and 3D maps in VR offers new possibilities in the field of cartography.
To evaluate the usability of VR 3D mapping, a future goal is to conduct further game-based crowdsourcing experiments in which participants will appraise the qualitative and quantitative aspects of the application. Their feedback will determine the advantages and disadvantages of the immersive experience. Future research will continue to explore the utilization of 2D and 3D maps in VR. More specifically, one of our future goals is to develop a protocol for UAS data acquisition for the creation of high-resolution 2D and 3D maps in VR. Furthermore, another research subject to be investigated is the transfer of thematic information and its efficient visualization on future 2D and 3D maps, as well as the combination of these two cartographic results in VR.

Author Contributions

Conceptualization, E.-E.P., A.P., N.-A.K., N.Z. and N.S.; methodology, E.-E.P., N.-A.K. and N.S.; software, E.-E.P.; writing—original draft preparation, E.-E.P., A.P., N.-A.K. and N.S.; writing—review and editing, E.-E.P., A.P., N.-A.K., N.Z. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Research e-Infrastructure "Interregional Digital Transformation for Culture and Tourism in Aegean Archipelagos" (Code Number MIS 5047046), which is implemented within the framework of the "Regional Excellence" Action of the Operational Program "Competitiveness, Entrepreneurship and Innovation". The action was co-funded by the European Regional Development Fund (ERDF) and the Greek State [Partnership Agreement 2014–2020].

Data Availability Statement

Not applicable.

Acknowledgments

We thank the editor and the three anonymous reviewers for their insightful comments which substantially improved the manuscript. We also thank Vlasios Kasapakis for his help with VR programming and Aikaterini Rippi for her help with English language editing.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Budiharto, W.; Irwansyah, E.; Suroso, J.S.; Chowanda, A.; Ngarianto, H.; Gunawan, A.A.S. Mapping and 3D modelling using quadrotor drone and GIS software. J. Big Data 2021, 8, 48. [Google Scholar] [CrossRef]
  2. James, M.R.; Robson, S.; Smith, M.W. 3-D uncertainty-based topographic change detection with structure-from-motion photogrammetry: Precision maps for ground control and directly georeferenced surveys. Earth Surf. Processes Landf. 2017, 42, 1769–1788. [Google Scholar] [CrossRef]
  3. Cook, K.L. An evaluation of the effectiveness of low-cost UAVs and structure from motion for geomorphic change detection. Geomorphology 2017, 278, 195–208. [Google Scholar] [CrossRef]
  4. Shahbazi, M.; Sohn, G.; Théau, J.; Menard, P. Development and evaluation of a UAV-photogrammetry system for precise 3D environmental modeling. Sensors 2015, 15, 27493–27524. [Google Scholar] [CrossRef]
  5. Papadopoulou, E.; Papakonstantinou, A.; Zouros, N.; Soulakellis, N. Scale-Variant Flight Planning for the Creation of 3D Geovisualization and Augmented Reality Maps of Geosites: The Case of Voulgaris Gorge, Lesvos, Greece. Appl. Sci. 2021, 11, 10733. [Google Scholar] [CrossRef]
  6. Topouzelis, K.; Papakonstantinou, A.; Doukari, M. Coastline change detection using unmanned aerial vehicles and image processing. Fresenius Environ. Bull. 2017, 26, 5564–5571. [Google Scholar]
  7. Soulakellis, N.; Tataris, G.; Papadopoulou, E.; Chatzistamatis, S.; Vasilakos, C.; Kavroudakis, D.; Roussou, O.; Papakonstantinou, A. Synergistic Exploitation of Geoinformation Methods for Post-earthquake 3D Mapping and Damage Assessment. In Intelligent Systems for Crisis Management; Altan, O., Chandra, M., Sunar, F., Tanzi, T.J., Eds.; Lecture Notes in Geoinformation and Cartography; Springer International Publishing: Cham, Switzerland, 2019; pp. 3–31. ISBN 978-3-030-05329-1. [Google Scholar]
  8. Papakonstantinou, A.; Topouzelis, K.; Pavlogeorgatos, G. Coastline Zones Identification and 3D Coastal Mapping Using UAV Spatial Data. ISPRS Int. J. Geo-Inf. 2016, 5, 75. [Google Scholar] [CrossRef]
  9. Turner, I.L.; Harley, M.D.; Drummond, C.D. UAVs for coastal surveying. Coast. Eng. 2016, 114, 19–24. [Google Scholar] [CrossRef]
  10. Madden, M.; Jordan, T.; Cotten, D.; Hare, N.; Pascua, A.; Bernardes, S. The future of Unmanned Aerial Systems (UAS) for monitoring natural and culture resources. In Proceedings of the Photogrammetric Week, Stuttgart, Germany, 7–11 September 2015. [Google Scholar]
  11. Śledź, S.; Ewertowski, M.W.; Piekarczyk, J. Applications of unmanned aerial vehicle (UAV) surveys and Structure from Motion photogrammetry in glacial and periglacial geomorphology. Geomorphology 2021, 378, 107620. [Google Scholar] [CrossRef]
  12. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  13. Sanz-Ablanedo, E.; Chandler, J.H.; Rodríguez-Pérez, J.R.; Ordóñez, C. Accuracy of Unmanned Aerial Vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sens. 2018, 10, 1606. [Google Scholar] [CrossRef] [Green Version]
  14. Forlani, G.; Dall’Asta, E.; Diotri, F.; di Cella, U.M.; Roncella, R.; Santise, M. Quality assessment of DSMs produced from UAV flights georeferenced with on-board RTK positioning. Remote Sens. 2018, 10, 311. [Google Scholar] [CrossRef]
  15. Papakonstantinou, A.; Stamati, C.; Topouzelis, K. Comparison of True-Color and Multispectral Unmanned Aerial Systems Imagery for Marine Habitat Mapping Using Object-Based Image Analysis. Remote Sens. 2020, 12, 554. [Google Scholar] [CrossRef]
  16. Siebert, S.; Teizer, J. Automation in Construction Mobile 3D mapping for surveying earthwork projects using an Unmanned Aerial Vehicle (UAV) system. Autom. Constr. 2014, 41, 1–14. [Google Scholar] [CrossRef]
  17. Soulakellis, N.; Vasilakos, C.; Chatzistamatis, S.; Kavroudakis, D.; Tataris, G.; Papadopoulou, E.E.; Papakonstantinou, A.; Roussou, O.; Kontos, T. Post-earthquake recovery phase monitoring and mapping based on UAS data. ISPRS Int. J. Geo-Inf. 2020, 9, 447. [Google Scholar] [CrossRef]
  18. Addink, E.A.; Van Coillie, F.M.B.; De Jong, S.M. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis. Int. J. Appl. Earth Obs. Geoinf. 2012, 15, 1–6. [Google Scholar] [CrossRef]
  19. Su, L.; Gibeaut, J. Using UAS Hyperspatial RGB Imagery for Identifying Beach Zones along the South Texas Coast. Remote Sens. 2017, 9, 159. [Google Scholar] [CrossRef]
  20. Bemis, S.P.; Micklethwaite, S.; Turner, D.; James, M.R.; Akciz, S.; Thiele, S.T.; Bangash, H.A. Ground-based and UAV-Based photogrammetry: A multi-scale, high-resolution mapping tool for structural geology and paleoseismology. J. Struct. Geol. 2014, 69, 163–178. [Google Scholar] [CrossRef]
  21. Pajares, G. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–329. [Google Scholar] [CrossRef]
  22. Yastikli, N.; Bagci, I.; Beser, C. The Processing of Image Data Collected by Light UAV Systems for GIS Data Capture and Updating. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-7/W2, 267–270. [Google Scholar] [CrossRef]
  23. Gonçalves, J.A.A.; Henriques, R. UAV photogrammetry for topographic monitoring of coastal areas. ISPRS J. Photogramm. Remote Sens. 2015, 104, 101–111. [Google Scholar] [CrossRef]
  24. Clapuyt, F.; Vanacker, V.; Van Oost, K. Reproducibility of UAV-based earth topography reconstructions based on Structure-from-Motion algorithms. Geomorphology 2016, 260, 4–15. [Google Scholar] [CrossRef]
  25. Brunier, G.; Fleury, J.; Anthony, E.J.; Gardel, A.; Dussouillez, P. Close-range airborne Structure-from-Motion Photogrammetry for high-resolution beach morphometric surveys: Examples from an embayed rotating beach. Geomorphology 2016, 261, 76–88. [Google Scholar] [CrossRef]
  26. Papadopoulou, E.E.; Kasapakis, V.; Vasilakos, C.; Papakonstantinou, A.; Zouros, N.; Chroni, A.; Soulakellis, N. Geovisualization of the excavation process in the Lesvos petrified forest, Greece using augmented reality. ISPRS Int. J. Geo-Inf. 2020, 9, 374. [Google Scholar] [CrossRef]
  27. Fallati, L.; Polidori, A.; Salvatore, C.; Saponari, L.; Savini, A.; Galli, P. Anthropogenic Marine Debris assessment with Unmanned Aerial Vehicle imagery and deep learning: A case study along the beaches of the Republic of Maldives. Sci. Total Environ. 2019, 693, 133581. [Google Scholar] [CrossRef]
  28. Klemas, V.V. Coastal and Environmental Remote Sensing from Unmanned Aerial Vehicles: An Overview. J. Coast. Res. 2015, 315, 1260–1267. [Google Scholar] [CrossRef]
  29. Ventura, D.; Bonifazi, A.; Gravina, M.F.; Ardizzone, G.D. Unmanned Aerial Systems (UASs) for Environmental Monitoring: A Review with Applications in Coastal Habitats. In Aerial Robots-Aerodynamics, Control and Applications; InTech: London, UK, 2017. [Google Scholar]
  30. Gomez, C.; Purdie, H. UAV-based Photogrammetry and Geocomputing for Hazards and Disaster Risk Monitoring—A Review. Geoenvironment. Disasters 2016, 3, 23. [Google Scholar] [CrossRef]
  31. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  32. Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral remote sensing from unmanned aircraft: Image processing workflows and applications for rangeland environments. Remote Sens. 2011, 3, 2529–2551. [Google Scholar] [CrossRef]
  33. Papakonstantinou, A.; Kavroudakis, D.; Kourtzellis, Y.; Chtenellis, M.; Kopsachilis, V.; Topouzelis, K.; Vaitis, M. Mapping Cultural Heritage in Coastal Areas with UAS: The Case Study of Lesvos Island. Heritage 2019, 2, 1404–1422. [Google Scholar] [CrossRef]
  34. Maceachren, A.M.; Kraak, M.J. Exploratory cartographic visualization advancing the agenda. Comput. Geosci. 1997, 23, 335–343. [Google Scholar] [CrossRef]
  35. Marzouki, A.; Lafrance, F.; Daniel, S.; Mellouli, S. The relevance of geovisualization in Citizen Participation processes. In Proceedings of the 18th Annual International Conference on Digital Government Research, Staten Island, NY, USA, 7–9 June 2017; pp. 397–406. [Google Scholar] [CrossRef]
  36. Cartwright, W.; Crampton, J.; Gartner, G.; Miller, S.; Mitchell, K.; Siekierska, E.; Wood, J. Geospatial information visualization user interface issues. Cartogr. Geogr. Inf. Sci. 2000, 28, 45–60. [Google Scholar] [CrossRef]
  37. Ruzinoor, C.M.; Shariff, A.R.M.; Pradhan, B.; Rodzi Ahmad, M.; Rahim, M.S.M. A review on 3D terrain visualization of GIS data: Techniques and software. Geo-Spat. Inf. Sci. 2012, 15, 105–115. [Google Scholar] [CrossRef]
  38. Cartwright, W.; Peterson, M.P. Multimedia Cartography, 1st ed.; Springer: Berlin/Heidelberg, Germany; pp. 1–10.
  39. Bleisch, S. 3D geovisualization–definition and structures for the assessment of usefulness. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 1, 129–134. [Google Scholar] [CrossRef]
  40. Chen, P. Visualization of real-time monitoring datagraphic of urban environmental quality. Eurasip J. Image Video Process. 2019, 2019, 42. [Google Scholar] [CrossRef]
  41. Herman, L.; Juřík, V.; Snopková, D.; Chmelík, J.; Ugwitz, P.; Stachoň, Z.; Šašinka, Č.; Řezník, T. A comparison of monoscopic and stereoscopic 3d visualizations: Effect on spatial planning in digital twins. Remote Sens. 2021, 13, 2976. [Google Scholar] [CrossRef]
  42. Ficarra, B. Virtual reality, augmented reality, and mixed reality. In Emerging Technologies for Nurses: Implications for Practice; Springer: New York, NY, USA, 2020; pp. 95–126. [Google Scholar] [CrossRef]
  43. Carbonell-Carrera, C.; Saorin, J.L.; Díaz, D.M. User VR experience and motivation study in an immersive 3D geovisualization environment using a game engine for landscape design teaching. Land 2021, 10, 492. [Google Scholar] [CrossRef]
  44. Azmi, A.; Ibrahim, R.; Abdul Ghafar, M.; Rashidi, A. Smarter real estate marketing using virtual reality to influence potential homebuyers’ emotions and purchase intention. Smart Sustain. Built Environ. 2021. [Google Scholar] [CrossRef]
  45. Deaky, B.A.; Parv, A.L. Virtual Reality for Real Estate-A case study. In Proceedings of the IOP Conference Series: Materials Science and Engineering, Brasov, Romania, 24–27 April 2018; Volume 399. [Google Scholar] [CrossRef]
  46. Markopoulos, P. Simulating an Exciting Game Experience within Virtual Reality. Master's Thesis, University of Turku, Turku, Finland, 2020. [Google Scholar] [CrossRef]
  47. Caciora, T.; Herman, G.V.; Ilieş, A.; Baias, Ş.; Ilieş, D.C.; Josan, I.; Hodor, N. The use of virtual reality to promote sustainable tourism: A case study of wooden churches historical monuments from Romania. Remote Sens. 2021, 13, 1758. [Google Scholar] [CrossRef]
  48. Edler, D.; Kühne, O.; Keil, J.; Dickmann, F. Audiovisual Cartography: Established and New Multimedia Approaches to Represent Soundscapes. KN-J. Cartogr. Geogr. Inf. 2019, 69, 5–17. [Google Scholar] [CrossRef]
  49. Pasquaré Mariotto, F.; Antoniou, V.; Drymoni, K.; Bonali, F.L.; Nomikou, P.; Fallati, L.; Karatzaferis, O.; Vlasopoulos, O. Virtual geosite communication through a webgis platform: A case study from santorini island (Greece). Appl. Sci. 2021, 11, 5466. [Google Scholar] [CrossRef]
  50. Hruby, F.; Ressl, R.; de la Borbolla del Valle, G. Geovisualization with immersive virtual environments in theory and practice. Int. J. Digit. Earth 2019, 12, 123–136. [Google Scholar] [CrossRef]
  51. Yang, Y.; Jenny, B.; Dwyer, T.; Marriott, K.; Chen, H.; Cordeil, M. Maps and Globes in Virtual Reality. Comput. Graph. Forum. 2018, 37, 427–438. [Google Scholar] [CrossRef]
  52. Evangelidis, K.; Sylaiou, S.; Papadopoulos, T. Mergin’mode: Mixed reality and geoinformatics for monument demonstration. Appl. Sci. 2020, 10, 3826. [Google Scholar] [CrossRef]
  53. Criscuolo, L.; Bordogna, G.; Carrara, P.; Pepe, M. CS projects involving geoinformatics: A survey of implementation approaches. ISPRS Int. J. Geo-Inf. 2018, 7, 312. [Google Scholar] [CrossRef]
  54. Çöltekin, A.; Lochhead, I.; Madden, M.; Christophe, S.; Devaux, A.; Pettit, C.; Lock, O.; Shukla, S.; Herman, L.; Stachoň, Z.; et al. Extended reality in spatial sciences: A review of research challenges and future directions. ISPRS Int. J. Geo-Inf. 2020, 9, 439. [Google Scholar] [CrossRef]
  55. Wang, S.; Mao, Z.; Zeng, C.; Gong, H.; Li, S.; Chen, B. A new method of virtual reality based on unity3D. In Proceedings of the 2010 18th International Conference on Geoinformatics, Beijing, China, 18–20 June 2010; pp. 2–6. [Google Scholar] [CrossRef]
  56. Xiao, W.; Mills, J.; Guidi, G.; Rodríguez-Gonzálvez, P.; Gonizzi Barsanti, S.; González-Aguilera, D. Geoinformatics for the conservation and promotion of cultural heritage in support of the UN Sustainable Development Goals. ISPRS J. Photogramm. Remote Sens. 2018, 142, 389–406. [Google Scholar] [CrossRef]
  57. Stachoň, Z.; Kubicek, P.; Málek, F.; Krejčí, M.; Herman, L. The Role of Hue and Realism in Virtual Reality. In Proceedings of the 7th International Conference on Cartography and GIS, Sozopol, Bulgaria, 18–23 June 2018; pp. 18–23.
  58. Zhang, X.; Zhao, P.; Hu, Q.; Ai, M.; Hu, D.; Li, J. A UAV-based panoramic oblique photogrammetry (POP) approach using spherical projection. ISPRS J. Photogramm. Remote Sens. 2020, 159, 198–219. [Google Scholar] [CrossRef]
  59. Kersten, T.P.; Edler, D. Special Issue “Methods and Applications of Virtual and Augmented Reality in Geo-Information Sciences”. PFG-J. Photogramm. Remote Sens. Geoinf. Sci. 2020, 88, 119–120. [Google Scholar] [CrossRef]
  60. Virtanen, J.P.; Julin, A.; Handolin, H.; Rantanen, T.; Maksimainen, M.; Hyyppä, J.; Hyyppä, H. Interactive geo-information in virtual reality-observations and future challenges. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.-ISPRS Arch. 2020, 44, 159–165. [Google Scholar] [CrossRef]
  61. Khalili, A. An XML-based approach for geo-semantic data exchange from BIM to VR applications. Autom. Constr. 2021, 121, 103425. [Google Scholar] [CrossRef]
  62. Templin, T.; Popielarczyk, D.; Gryszko, M. Using Augmented and Virtual Reality (AR/VR) to Support Safe Navigation on Inland and Coastal Water Zones. Remote Sens. 2022, 14, 1520. [Google Scholar] [CrossRef]
  63. Regolini-Bissig, G.; Reynard, E. Mapping Geoheritage_6 Papers; Geovisions, 127; Institut de Géographie: Paris, France, 2010. [Google Scholar]
  64. Dong, W.; Yang, T.; Liao, H.; Meng, L. How does map use differ in virtual reality and desktop-based environments? Int. J. Digit. Earth 2020, 13, 1484–1503. [Google Scholar] [CrossRef]
  65. Martin, S.; Reynard, E.; Pellitero Ondicol, R.; Ghiraldi, L. Multi-scale Web Mapping for Geoheritage Visualisation and Promotion. Geoheritage 2014, 6, 141–148. [Google Scholar] [CrossRef]
  66. Kalacska, M.; Arroyo-Mora, J.P.; Lucanus, O. Comparing uas lidar and structure-from-motion photogrammetry for peatland mapping and virtual reality (Vr) visualization. Drones 2021, 5, 36. [Google Scholar] [CrossRef]
  67. Spero, H.R.; Vazquez-Lopez, I.; Miller, K.; Joshaghani, R.; Cutchin, S.; Enterkine, J. Drones, virtual reality, and modeling: Communicating catastrophic dam failure. Int. J. Digit. Earth 2022, 15, 585–605. [Google Scholar] [CrossRef]
  68. Gerloni, I.G.; Carchiolo, V.; Vitello, F.R.; Sciacca, E.; Becciani, U.; Costa, A.; Riggi, S.; Bonali, F.L.; Russo, E.; Fallati, L.; et al. Immersive virtual reality for earth sciences. In Proceedings of the 2018 Federated Conference on Computer Science and Information Systems (FedCSIS), Poznan, Poland, 9–12 September 2018; Volume 15, pp. 527–534. [Google Scholar] [CrossRef]
  69. Havenith, H.-B. 3D Landslide Models in VR. In Understanding and Reducing Landslide Disaster Risk: Volume 4 Testing, Modeling and Risk Assessment; Tiwari, B., Sassa, K., Bobrowsky, P.T., Takara, K., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 195–204. ISBN 978-3-030-60706-7. [Google Scholar]
  70. Wang, J.; Zouros, N. Educational Activities in Fangshan UNESCO Global Geopark and Lesvos Island UNESCO Global Geopark. Geoheritage 2021, 13, 51. [Google Scholar] [CrossRef]
  71. Zouros, N.; Velitzelos, E.; Valiakos, E.; Ververis, K. Submarine petrified forest in Lesvos Greece. In Proceedings of the 5th International Symposium on Eastern Mediterranean Geology, Thessaloniki, Greece, 14–20 April 2004; pp. 3–6. [Google Scholar]
  72. Zouros, N. The European Geoparks Network. Episodes 2004, 27, 165–171. [Google Scholar] [CrossRef]
  73. Zouros, N. Geodiversity and Sustainable Development: Geoparks—A New Challenge for Research and Education in Earth Sciences. Bull. Geol. Soc. Greece 2017, 43, 159. [Google Scholar] [CrossRef]
  74. Agisoft Metashape Professional Edition, Version 1.9.1; Agisoft LLC: St. Petersburg, Russia.
  75. ArcGIS Pro, version 3.0.0; Software. Desktop; Esri Inc.: Redlands, CA, USA, 2022.
  76. Esri. What Is ArcGIS Enterprise? Available online: https://enterprise.arcgis.com/en/get-started/10.7/windows/what-is-arcgis-enterprise-.htm (accessed on 14 June 2021).
  77. Valve Software. SteamVR, version 2.00; Valve Software: Bellevue, WA, USA, 2020; Available online: https://partner.steamgames.com/doc/features/steamvr/ (accessed on 6 July 2022).
  78. Litchi Mission Hub Flight Planner. Available online: flylitchi.com/hub (accessed on 1 January 2018).
  79. DJI Pilot Flight Control App; Matrice 300 RTK: Pilot PE v1.8.0.; DJI: Shenzhen, China, 2021.
  80. Abdalla, R. Introduction to Geospatial Information and Communication Technology (GeoICT); Springer International Publishing: Cham, Switzerland, 2016. [Google Scholar]
  81. Whitehead, K.; Hugenholtz, C.H. Applying ASPRS accuracy standards to surveys from small unmanned aircraft systems (UAS). Photogramm. Eng. Remote Sens. 2015, 81, 787–793. [Google Scholar] [CrossRef]
  82. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
  83. Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. “Structure-from-Motion” photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314. [Google Scholar] [CrossRef]
  84. Anderson, K.; Westoby, M.J.; James, M.R. Low-budget topographic surveying comes of age: Structure from motion photogrammetry in geography and the geosciences. Prog. Phys. Geogr. Earth Environ. 2019, 43, 163–173. [Google Scholar] [CrossRef]
  85. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
  86. Tao, Y.; Xia, Y.; Xu, T.; Chi, X. Research progress of the scale invariant feature transform (SIFT) descriptors. J. Converg. Inf. Technol. 2010, 5, 116–121. [Google Scholar] [CrossRef] [Green Version]
  87. Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
  88. Yang, M.Y.; Förstner, W.; Photogrammetry, D. Plane Detection in Point Cloud Data; Technical Report; TR-IGG-P-2010-01; University of Bonn: Bonn, Germany, 2010. [Google Scholar]
  89. Smith, M.W.; Carrivick, J.L.; Quincey, D.J. Structure from motion photogrammetry in physical geography. Prog. Phys. Geogr. 2015, 40, 247–275. [Google Scholar] [CrossRef]
  90. Gavalas, D.; Kasapakis, V.; Konstantopoulos, C.; Pantziou, G.; Vathis, N.; Zaroliagis, C. The eCOMPASS multimodal tourist tour planner. Expert Syst. Appl. 2015, 42, 7303–7316. [Google Scholar] [CrossRef]
  91. Kasapakis, V.; Gavalas, D. Pervasive gaming: Status, trends and design principles. J. Netw. Comput. Appl. 2015, 55, 213–236. [Google Scholar] [CrossRef]
  92. Unity 3D (Game Engine). 2021. Available online: http://www.unity3d.com/ (accessed on 1 December 2021).
Figure 1. Location of the fossilite ferrous site, Lesvos Petrified Forest, Lesvos, Greece (Sources: Esri, HERE, Garmin, FAO, METI/NASA, USGS).
Figure 2. Flowchart of the main stages in the methodology.
Figure 3. Flight plans for: (a) wider study area, (b) fossil-bearing site, (c) root system, and (d) single fossils.
Figure 4. Map with the measurement positions of the ground control points and the checkpoints.
Figure 5. Comparison of point clouds of the petrified root system: (a) cloud to cloud distance, (b) surface density of RS-1, (c) surface density of RS-2, (d) RGB point cloud of RS-1, and (e) RGB point cloud of RS-2.
Figure 6. Point clouds of the fossilized trunk comparison: (a) cloud to cloud distance, (b) surface density of PPT-1, (c) surface density of PPT-2, (d) RGB point cloud of PPT-1, and (e) RGB point cloud of PPT-2.
Figure 7. Structure of rooms/scenes in the VR application.
Figure 8. ArcGIS Maps SDK in Unity 3D Engine and the preview of the fossilite ferrous site on 11 July 2018.
Figure 9. VR application: (a) multiscale room in Unity 3D Engine, (b) multitemporal room, panels, teleport area, and teleport points, (c) a user of the VR application, and (d) the corresponding image projected to the user in the VR headset via SteamVR.
Table 1. Characteristics of UASs and cameras used in image acquisition.
UAS/Camera | Megapixels | Aperture | fr (mm) Equivalent | fr (mm) Actual | iw (pixels) | sw (mm)
Phantom 4 Pro | 20 | f/2.8–f/11 | 24 | 8.8 | 5472 × 3648 | 13.2 × 8.8
Mavic Pro Hasselblad | 20 | f/2.8–f/11 | 28 | 11 | 5472 × 3648 | 13.2 × 8.8
Inspire (Zenmuse X5S), DJI MFT / Olympus lenses | 20.8 | f/1.7 / f/1.8 | – | 15 / 25 | 5280 × 3956 | 13 × 17.3
Matrice 300 (Zenmuse P1) | 48 | f/2.8–f/16 | – | 35 | 8192 × 5460 | 35.9 × 24
Table 2. Recording dates, UAS type, cartographic scale, corresponding GSD, and flight altitude.
Date (day/month/year) | UAS | Recording Sensor | Scale | GSD | Height of Flight
11/07/2018 | Phantom 4 Pro | DJI 20 MP (9 mm) | 1:120 | 0.8 cm (<1.3 cm) | 30 m
13/03/2021 | Phantom 4 Pro | DJI 20 MP (9 mm) | 1:120 | 1.07 cm (<1.3 cm) | 40 m
08/05/2021 | Inspire 2 | Zenmuse X5S (25 mm) | 1:120 | 0.5 cm (<1.3 cm) | 50 m
08/05/2021 | Inspire 2 | Zenmuse X5S (25 mm) | 1:20 | 0.16 cm (<0.20 cm) | 15 m
06/07/2021 | Inspire 2 | Zenmuse X5S (15 mm) | 1:120 | 0.9 cm (<1.3 cm) | 40 m
21/07/2021 | Mavic 2 Pro | Hasselblad (11 mm) | 1:120 | 0.16 cm (<1.3 cm) | 40 m
11/09/2021 | Mavic 2 Pro | Hasselblad (11 mm) | 1:120 | 0.16 cm (<1.3 cm) | 40 m
11/09/2021 | Inspire 2 | Zenmuse X5S (25 mm) | ≥1:10 | 0.1 cm (≤0.1 cm) | 5 m
13/10/2021 | Inspire 2 | Zenmuse X5S (15 mm) | 1:120 | 0.9 cm (<1.3 cm) | 40 m
15/05/2022 | Matrice 300 | Zenmuse P1 (35 mm) | 1:120 | 0.5 cm (<1.3 cm) | 50 m
Table 3. Georeference errors on x, y, z and total RMS for all datasets collected on the eight recording dates.
Date (day/month/year) | X Error (cm) | Y Error (cm) | Z Error (cm) | Total RMS (cm)
11/07/2018 | 0.40 | 0.93 | 1.41 | 1.74
13/03/2021 | 2.48 | 1.82 | 0.52 | 3.12
08/05/2021 | 2.42 | 1.24 | 0.24 | 2.73
06/07/2021 | 2.02 | 2.78 | 0.84 | 3.54
21/07/2021 | 2.66 | 2.05 | 1.06 | 3.53
11/09/2021 | 2.44 | 2.11 | 1.03 | 3.38
13/10/2021 | 2.60 | 3.01 | 0.80 | 4.06
15/05/2022 | 2.59 | 2.97 | 1.10 | 4.01
Table 4. Indicative results of image-based 3D modeling processing: (A) 3D models, (B) orthophotomaps, and (C) recording dates.
A. 3D Model | B. Orthophotomap | C. Date (Scale)
[3D model image] | [orthophotomap image] | 11/07/2018 (1:120)
[3D model image] | [orthophotomap image] | 15/05/2022 (1:120)
[3D model image] | [orthophotomap image] | 08/05/2021 (1:20)
[3D model image] | [orthophotomap image] | 11/09/2021 (1:10)
Table 5. Number of points of the 3D point clouds and number of surfaces of the 3D meshes for each recording date and the corresponding cartographic scale.
Date (day/month/year) | Scale | Point Cloud (points) | Faces
11/07/2018 | 1:120 | 31,438,368 | 4,978,312
13/03/2021 | 1:120 | 20,729,534 | 3,213,255
08/05/2021 | 1:120 | 23,517,312 | 4,115,665
08/05/2021 | 1:20 | 6,354,596 | 2,644,784
06/07/2021 | 1:120 | 14,422,086 | 2,011,447
21/07/2021 | 1:120 | 6,065,784 | 2,856,779
11/09/2021 | 1:120 | 14,572,981 | 3,335,834
11/09/2021 | 1:10 | 14,195,113 | 4,947,022
13/10/2021 | 1:120 | 23,302,524 | 2,525,060
15/05/2022 | 1:120 | 33,875,313 | 6,273,056
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
