Article

Using Virtual and Augmented Reality with GIS Data

Department of Geomatics, Faculty of Civil Engineering, Czech Technical University in Prague, 16629 Prague, Czech Republic
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2024, 13(7), 241; https://doi.org/10.3390/ijgi13070241
Submission received: 13 February 2024 / Revised: 3 June 2024 / Accepted: 26 June 2024 / Published: 5 July 2024

Abstract

This study explores how combining virtual reality (VR) and augmented reality (AR) with geographic information systems (GIS) can transform data visualization. It traces the historical development of these technologies and highlights key milestones that paved the way for this study’s objectives. While existing platforms like Esri’s software and Google Earth VR show promise, they lack complete integration for immersive GIS visualization. This gap has led to the need for a dedicated workflow to integrate selected GIS data into a game engine for visualization purposes. This study primarily utilizes QGIS for data preparation and Unreal Engine for immersive visualization. QGIS handles data management, while Unreal Engine offers advanced rendering and interactivity for immersive experiences. To tackle the challenge of handling extensive GIS datasets, this study proposes a workflow involving tiling, digital elevation model generation, and transforming GeoTIFF data into 3D objects. Leveraging QGIS and Three.js streamlines the conversion process for integration into Unreal Engine. The resultant virtual reality application features distinct stations, enabling users to navigate, visualize, compare, and animate GIS data effectively. Each station caters to specific functionalities, ensuring a seamless and informative experience within the VR environment. This study also delves into augmented reality applications, adapting methodologies to address hardware limitations for smoother user experiences. By optimizing textures and implementing augmented reality functionality in Swift with the RealityKit and ARKit frameworks, this study extends the immersive GIS experience to iOS devices. In conclusion, this research demonstrates the potential of integrating virtual reality, augmented reality, and GIS, pushing data visualization into new realms. The innovative workflows and applications developed serve as a testament to the evolving landscape of spatial data interpretation and engagement.

1. Introduction

Virtual reality (VR) and augmented reality (AR) are modern technologies that enable high-quality visualization in a 3D world, and recent years have brought increasingly advanced software and game engines that support more and more types of input. From this point of view, the benefits of VR and AR are unquestionable. These technologies offer users a new way of looking at previously known data. They provide immersion, interactivity, and entertainment, especially for younger generations. VR and AR require advanced hardware and, of course, data. The hardware is in constant development and includes visualization assets in the form of a headset and advanced computing. Data can be procured in various ways. The most widely used ways of acquiring raster map data currently include remote sensing of the Earth in the form of satellite data, aerial photogrammetry, or processing of historical cartographic data [1]. The typical technologies for acquiring textured 3D models of any object are automated close-range photogrammetry using hand-held cameras or drones, laser scanning, or mobile laser scanning [2,3]. The use of 360° cameras is a specialty of documentation work; this technology can substitute for a full 3D model of the environment, allowing viewing in a regular web browser as well as with a headset [4].
One of the first references to VR in research and scholarly interest dates back to 1991. Howard Rheingold, a cultural critic, published a book in 1991, Virtual Reality: The Revolutionary Technology of Computer-Generated Artificial Worlds—and How It Promises to Transform Society, which details the historical development of this technology. The book focused on VR development within university research groups and technology companies and among computer professionals, particularly in California’s Silicon Valley [5,6]. In 1995, Koller et al. produced probably the first publication focused on real-time 3D GIS (geographic information system). In their paper, they concentrated on virtual GIS, i.e., linking familiar GIS data, such as raster layers and a digital elevation model (DEM) [7].
The research presented in this paper aimed to establish a connection between the evolving realms of virtual and augmented reality technologies, game engines, and the conventional outputs of geographic information systems, including digital elevation models, land cover layers, and topographic maps. Initially, an extensive investigation was conducted, focusing on identifying applications specifically tailored for visualizing GIS data.
A game engine was also used in the study by Lütjens et al., 2019, which involved creating a DEM of a portion of the Clyde Inlet fjord (Arctic Canada) inside the game engine and then visualizing it in virtual reality using the HTC Vive head-mounted display (HMD) [8].
However, the findings revealed a limited pool of applications aligned with the study’s objectives—an application that can display the area of interest and change dataset layers, with support for virtual and augmented reality. Many applications were in their beta test phases and only partially fulfilled the features necessary for comprehensive GIS data visualization [9,10].

2. Materials and Methods

This section describes the existing software options and the data used.

2.1. Current State of the Art and Objectives

Among the identified applications, one closely resembling this study’s research scope was software developed by Esri, a renowned company in the realm of geographic information systems, widely recognized for its flagship GIS application, ArcGIS. This application primarily showcased the display of a digital elevation model integrated with a map layer or texture, indicating potential alignment with this study’s objectives [10,11,12].
Furthermore, while exploring available platforms, it was noted that the Google Earth VR app also offers certain functionalities pertinent to this research. Specifically, it allows for general map viewing in virtual reality and facilitates immersive flythroughs of landscapes, demonstrating another avenue for integrating GIS data into a virtual environment [11].
However, despite these identified applications showcasing glimpses of alignment with this study’s intent, none fully encapsulate the comprehensive integration of GIS outputs with virtual and augmented reality technologies as envisioned in this research. A platform capable of bridging these domains into a unified solution for immersive GIS data visualization and interaction therefore remains the focus of this study.
VR represents a transformative computer-generated realm, immersing users within intricately crafted digital landscapes. This technology transcends conventional boundaries, introducing opportunities for diverse applications. Its capacity to vividly visualize intricate 3D models and generate environments rooted in real-world GIS data is just the tip of the iceberg. VR extends its capabilities to create historically accurate museums and immersive entertainment experiences, redefining user interactions in digitally created spaces. The so-called virtual GIS (VGIS) and its substantial benefits for analysis and visualization are discussed in the comprehensive GIS compendium by Longley et al., 2005 [12].
The fusion of VR with fundamental GIS data allows for unforeseen vistas, offering users an entirely fresh outlook on familiar datasets, challenging conventional perceptions of spatial information. Concurrently, augmented reality carves a significant impact across industries by overlaying digital models onto the real world through specialized glasses or devices like iPads, enhancing user experiences by integrating virtual elements into their immediate surroundings.
Both VR and AR possess unique strengths and limitations, and this study endeavors to explore and dissect these visualization technologies comprehensively. While VR elevates visualization to a higher level with its immersive experiences, it demands substantial computational power, rendering it less accessible due to stringent hardware requirements. Conversely, AR, while offering more accessibility through everyday devices like smartphones or tablets, grapples with hardware limitations, compromising its visual sophistication.
The confluence of VR and AR in GIS demonstrations presents an exciting challenge and a golden opportunity. It serves as a platform to revolutionize data visualization methodologies and craft entirely novel ways to engage with spatial information. By delving into the intricacies, advantages, and drawbacks of both technologies, this study aims to pioneer innovative visualization methods, democratizing access to immersive experiences and transforming the conventional paradigms of spatial data interpretation and utilization. The goal is to empower individuals to interact with GIS data in unprecedented and enriching ways, irrespective of technological barriers, thereby fostering a more inclusive and insightful engagement with spatial information.
The idea of using classical GIS data as raster and vector data layers was described by Baumgartinger of the University of Vienna in his thesis entitled “Concepts of Virtual Reality in GIS” [13]. Among other things, the author describes the application fields for virtual reality GIS (VRGIS). The second of these modern visualization technologies—AR and its connection with GIS—is surveyed by Bazargani et al., 2022 [14]. The focus is on the depiction of technological elements such as bridges and buildings, and the survey covers several aspects that accompany the use of AR, such as tracking, rendering, imaging devices, and other functions.
An interesting project that combines raster data and virtual reality in an application created in the Unreal game engine is the 3D visualization of a historical river valley with the support of procedural modeling [15].
Another interesting project, by Rink et al. [16], processed surface data and combined it with a groundwater model using ArcGIS and QGIS, visualizing the result in VR using the Unity game engine. A common problem in VR is creating a pleasant user interface (UI); in this project, it was solved in an interesting way, where the user looks at a video wall with projections on both sides.
Game engines are among the best options for visualizing GIS data, as evidenced by another study by Helbig et al. [17], who linked mobile sensor data associated with buildings and vegetation with satellite images in a virtual environment using Unity. The application includes a fairly extensive UI where the user can, for example, hide layers, filter data, change perspective, and more. The results of the study were presented as stereoscopic pictures.
Several authors have explored the use of VR and AR for many other purposes, such as virtual museums or, more generally, the visualization of objects of historical significance, which can be buildings, landscapes, or even smaller artifacts [18]. These activities increased significantly during the COVID-19 pandemic due to the inability to travel. Nowadays, they are proving important for reducing the carbon footprint and for people with disabilities, for example, generally making history and information accessible without the need to travel [19,20].
The motivation for this paper was the use of innovative imaging technologies, such as VR and AR, and their integration with GIS data. As part of the Geo-harmonizer project (see Funding), which focuses on reducing the fragmentation of national data by providing seamless geographical data over the entire extent of the EU under open data licenses, it was necessary to keep up with the development of visualization technologies and create a beta application that leverages the datasets created within the Geo-harmonizer project and provides users with guidance on how to create visualizations in VR and AR using available open-source technologies and open data. For this study, the land cover product for continental Europe over 20 years (2000–2019) [21] was chosen.
The main objectives of this study are to create a workflow for displaying GIS data in VR and AR using game engines and to create a sample application that uses these modern display technologies to present classic GIS data (such as land cover and a digital elevation model) with basic controls, such as layer switching, layer blending, and evolution over the years. Another goal is to freely share the developed workflow and applications with users interested in the same topic, allowing them to visualize their own data in VR and AR with the help of this study.
In the following subsections, the two applications closest to our research are presented.

2.1.1. Esri VR/AR

Esri already provides applications that support both VR and AR, much of it 360° imagery that can be viewed using VR goggles or a smartphone. For the most part, it uses CityEngine to visualize urban areas, for example, to show planned buildings in their future context. The AR app presented at the Esri 2019 developer summit allowed the display of an area of interest, but only with a classic texture. In 2023, Esri introduced the ArcGIS Maps SDKs for game engines [9,10]. An AR visualization of a DEM in ArcGIS is shown in Figure 1.
Esri has evidently recognized the substantial benefits of game engines, making the combination of its programs with a game engine a logical option.

2.1.2. GeaVR—Visual Discovery Framework

GeaVR is an open environment created in the Unity engine that allows users to upload their own datasets [22]. While the interface is full of various preset tools, such as measurements, topographic models, or querying GNSS (global navigation satellite system) coordinates, it is not possible to change layers, and most of the case studies that used this framework were based on imagery taken by drones and processed photogrammetrically. The resulting model must have a detailed texture for the surface to be discernible. The framework is therefore better suited to an individual, smaller area of interest that is to be studied in detail and for which detailed image data exist [23,24]. Figure 2 shows the GeaVR framework with a 3D photogrammetric model. A similar approach was used in the article “Virtual reality based novel use case in remote sensing and GIS” from 2021 [25]. For the applications described above, the trend and benefit of using game engines as the core for visualizing and working with datasets from GIS programs is unquestionable.
In many cases, it is even more beneficial than creating separate software to support VR. Game engines and their usage make the work of developers easier.
However, none of the above-mentioned applications meet our objectives, so the trend of game engines needs to be considered and applied in this case study, and a complete workflow needs to be created to convert our selected data into a game engine, obtain the final visualization, and work with the datasets. After a wide search of articles on similar topics, the article “Official Survey Data and Virtual Worlds” was the closest to our concept. It is the only one that deals with loading different data layers, visualizing them, and switching between them within an application created in a game engine with virtual reality support [27].

3. Data and Software Used

To achieve the objectives of this study, it was necessary to find suitable software and GIS datasets. It was essential to process input GIS data, such as land cover, topographic maps, and a digital elevation model for a particular area. The DEM and land cover (LCV) data layers were provided by the Geo-harmonizer project. The DEM is based on publicly available data sources and predicted using ensemble machine learning (EML) [28,29]. The land cover layers, produced by a spatiotemporal ensemble machine learning framework based on LUCAS, CORINE, and GLAD Landsat data [21], are available for the years 2000 to 2019 at 30 m resolution [30,31]. The topographic map was produced from the OpenStreetMap dataset [32]. To process the input GIS data, open-source tools such as QGIS, GDAL, and GRASS GIS were used.
QGIS 3.24, an open-source geographic information system, serves as a comprehensive tool for managing, analyzing, and manipulating geospatial data. It excels in data handling, offering a wide range of geoprocessing tools and cartographic capabilities. QGIS enables users to import, edit, and analyze land cover datasets efficiently, providing a robust foundation for data preparation and initial analysis. Because the software is open source, it allows us, if necessary, to create scripts or reprogram the functions needed for the objectives of the study. (The Geo-harmonizer project aimed to reduce national geographic data problems by using seamless comprehensive datasets covering the entire European Union; https://opendatascience.eu/geoharmonizer-project/, accessed on 2 July 2024.)
Next, it was necessary to find suitable software to create a virtual reality environment for processing the data above. The most commonly used software for working with VR and AR are game engines. The main game engines are Unreal and Unity. Both of these engines have similar capabilities. Unreal Engine was chosen for the study because of the authors’ experience working in it. However, it can be assumed that the same result could be achieved in Unity.
Unreal Engine stands as a leading real-time 3D creation platform renowned for its advanced rendering capabilities and versatile toolset. While primarily known for game development, Unreal Engine’s capabilities extend far beyond gaming. Its powerful graphics rendering, physics engine, and cross-platform support make it an ideal environment for visualizing GIS data in immersive virtual worlds.
The integration of QGIS for data preparation, analysis, and initial visualization, followed by Unreal Engine for creating immersive, real-time, and visually stunning representations, offers a holistic approach to leveraging GIS land cover datasets. QGIS enables organization and analysis of the data, preparing it for Unreal Engine’s utilization. Unreal Engine, with its advanced rendering and interactive capabilities, allows for the creation of immersive experiences, enabling stakeholders to explore and understand land cover data in a visually compelling and interactive manner.
Together, these software tools provide a comprehensive pipeline for handling GIS land cover datasets, from data preparation and analysis in QGIS to immersive visualization and interactive experiences in Unreal Engine. This integration empowers users to gain deeper insights, make informed decisions, and communicate complex spatial information effectively through visually engaging and interactive presentations.
This research used common but well-configured computers. To develop the application in Unreal Engine, a more powerful workstation was needed, especially one with a high-end graphics card, in this case, an NVIDIA RTX 3090.

4. Main Research

The initial phase of this research posed challenges, as evident from the preceding section, due to the limited resources available to address the identified issue. Developing an innovative and comprehensive solution necessitates the careful selection of essential elements from various sources and integrating these outputs into a unified framework. Establishing priority features for both VR and AR applications marked the outset of this research.
The key features required for these applications include the following:
  • Ability to designate the area of interest;
  • Switching between different land cover data periods;
  • Seamless blending with the map during viewing;
  • Facilitating comparison between land cover data from different years.
The selection of suitable formats for data visualization is pivotal. With an array of data formats available, each offering distinct advantages, identifying the most suitable formats for this study became crucial. For the development of a prototype application demonstrating the viability of using VR and AR technologies for GIS visualization, primary map coverage was derived from DEM, LCV, and OpenStreetMap (OSM) datasets. A sample of the datasets used is shown in Figure 3. Raster data were chosen for this project and the required outputs. Vector data can, of course, also be used within game engines, but they are more suitable for generating 1:1 landscapes, including vegetation. This topic, including the procedural modeling of vegetation and landscape features, is addressed in the paper by Janovský et al. [15].
The Netherlands and the Czech Republic were selected as the areas of interest for this research. The aim is to show the potential and feasibility of the selected GIS visualization technologies. These two countries were chosen on the basis of the Geo-harmonizer consortium agreement; we thus focused on two relatively small European countries with different land cover.
Additionally, it was presumed that reading GeoTIFF, the best-known raster data format in GIS, would not be feasible within the game engine. It was therefore necessary to establish an environment within Unreal Engine that operates independently of this specific information. Subsequent verification confirmed this assumption. Although Unreal Engine is capable of handling DEM-type data, specifically elevation maps, a notable issue arises regarding their utilization. While the engine can process these data to construct the environment—primarily through the landscape function—it presents a challenge when attempting to use them on individual tiles or in a scaled-down format.
The basic question was how to display a large territory without being able to use the classical pyramid data-loading algorithm, which the game engine does not include. Implementing such a sophisticated algorithm would clearly be difficult, even for professional programmers.
Therefore, it was necessary to create a workflow that would allow us to break the area of interest into smaller units and to model the DEM as a 3D object for easier integration into the game engine and then load it into Unreal Engine.
The area of interest was subdivided into tiles, creating a tile map of the area of interest. Each tile was 30 × 30 km in size.
A DEM in GeoTIFF format was generated for each tile. However, the GeoTIFF still needed to be converted into a 3D object for the game engine. This was achieved using the Three.js library. Figure 4 and Figure 5 show the map tiling and the generation of the DEM for a tile.
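The tiling itself can be reproduced with plain GDAL. The following Python sketch (file names illustrative, not taken from the project repository) cuts a large GeoTIFF into 30 × 30 km tiles in the metric EPSG 3035 system used throughout the workflow:

```python
from osgeo import gdal

TILE_SIZE_M = 30_000  # 30 km, in EPSG 3035 map units (metres)

def make_tiles(src_path: str, out_dir: str) -> None:
    src = gdal.Open(src_path)
    gt = src.GetGeoTransform()                 # (ulx, xres, 0, uly, 0, -yres)
    ulx, uly = gt[0], gt[3]
    nx = int(src.RasterXSize * gt[1] // TILE_SIZE_M) + 1
    ny = int(src.RasterYSize * abs(gt[5]) // TILE_SIZE_M) + 1
    for i in range(nx):
        for j in range(ny):
            x0 = ulx + i * TILE_SIZE_M
            y0 = uly - j * TILE_SIZE_M
            # Edge tiles may extend past the raster extent; GDAL pads them.
            gdal.Translate(
                f"{out_dir}/tile_{i}_{j}.tif", src,
                projWin=[x0, y0, x0 + TILE_SIZE_M, y0 - TILE_SIZE_M],
            )

make_tiles("dem_cz_epsg3035.tif", "tiles")
```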

4.1. Three.js

Three.js is a widely adopted JavaScript library utilized for generating 3D graphics within web browsers and has been explored in conjunction with QGIS to elevate its 3D visualization functionalities significantly [33].
Users and developers have undertaken experiments with custom scripts and extensions to harness the potential of Three.js within the QGIS environment. These initiatives aim to enrich QGIS’s 3D visualization capabilities by leveraging Three.js’s powerful features, enabling the creation of more dynamic and interactive 3D presentations through a plugin called Qgis2threejs [34].
The Qgis2threejs plugin allows the user to generate a 3D model from a DEM with various settings. Suitable values for all the parameters were determined by testing. Based on these experiments, a customized script was developed by the authors to automate the whole process of generating 3D objects from the input DEM data layers. The script uses the QGIS API to load an input TIFF file into a QGIS project. In the next step, the loaded raster data layer is converted using the Qgis2threejs API (ModelExporter class) to a 3D object in glTF format. An example of the basic conversion of a DEM to a 3D model is shown in Figure 6.
The glTF format was chosen because it is highly efficient and versatile, making it ideal for various 3D applications, like web-based experiences, AR/VR, and gaming. It is widely supported across platforms, ensuring a seamless exchange of assets between different workflows.
The source code, including the used settings, is available from the Git repository [29]. From the user’s perspective, the script is integrated as a macro, which is launched automatically when opening a dedicated QGIS project. The macro processes all the TIFF files located in the directory defined by the GLTF_INPUT environment variable.
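A minimal sketch of the macro’s logic, assuming the Qgis2threejs ModelExporter API mentioned above (the import path and call signature are our assumptions; the authoritative script is in the Git repository [29]):

```python
import glob
import os

from qgis.core import QgsProject, QgsRasterLayer
from Qgis2threejs.export import ModelExporter  # import path assumed

def export_dem_tiles() -> None:
    input_dir = os.environ["GLTF_INPUT"]       # directory with DEM GeoTIFF tiles
    for tif in sorted(glob.glob(os.path.join(input_dir, "*.tif"))):
        layer = QgsRasterLayer(tif, os.path.basename(tif))
        if not layer.isValid():
            continue
        QgsProject.instance().addMapLayer(layer)    # load the tile into the project
        exporter = ModelExporter()                  # export settings omitted
        exporter.export(layer, tif[:-4] + ".gltf")  # write the 3D object (call assumed)
```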
After creating the 3D objects, the remaining GIS data were processed, i.e., the data needed for each tile were generated to serve as textures in the game engine. For each tile, OpenStreetMap and land cover textures were created at the same time. Working with models in game engines differs from working with them in GIS programs.
Game engines are built to work with 3D models (here created through a QGIS plugin) and textures; this is the foundation of how they operate. This case study aimed to demonstrate that even without any special converters or plugins for the game engine, it is possible to visualize GIS data and work with it.
The complete workflow is described in the following subsection.

4.2. Preprocessing GIS Data

The workflow for preprocessing the input GIS data used for the VR and AR applications is shown in Figure 7. The computational parts were automated in the form of virtual environments built with Docker technology. This technology was chosen to increase the reproducibility of the entire calculation process without the need to install individual software components and their settings on a given operating system. To process a different user-selected area of interest, it is sufficient to use the pre-prepared Docker containers. Different software components are used in each calculation step; for this reason, a separate Docker container was defined for each computational step.
In the first step, the input data are retrieved from the Geo-harmonizer metadata catalogue. These are the DEM and LCV layers in the COG (cloud-optimized GeoTIFF) data format. The process of finding and downloading the data is provided by the eumap library [28], which is integrated in the “download_input_data” Docker container. The input data are obtained in the ETRS89-extended/LAEA Europe coordinate system (EPSG 3035) with a spatial resolution of 30 m, which is used throughout the rest of the processing workflow.
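As a generic illustration of what the “download_input_data” step accomplishes (the actual implementation uses the eumap library, whose API we do not reproduce here), an area of interest can be read directly from a remote cloud-optimized GeoTIFF with GDAL’s /vsicurl/ driver; the URL below is a placeholder:

```python
from osgeo import gdal

# Placeholder address; the real layer URLs come from the Geo-harmonizer
# metadata catalogue.
COG_URL = "/vsicurl/https://example.org/odse/dem_europe_30m_epsg3035.tif"

def fetch_aoi(out_path: str, ulx: float, uly: float, lrx: float, lry: float) -> None:
    # Thanks to the COG layout, only the raster blocks intersecting the
    # requested window (EPSG 3035 coordinates) are transferred.
    gdal.Translate(out_path, gdal.Open(COG_URL), projWin=[ulx, uly, lrx, lry])
```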
The area of interest is selected by the user interactively in the QGIS application environment based on a predefined tiling system. In the next two steps, DEM, LCV, and OSM (OpenStreetMap) tiles are generated in GeoTIFF format (https://data.opendatascience.eu/geonetwork, accessed on 2 July 2024).
The DEM and LCV tiles are generated based on the static input data downloaded in the first step. The computation of the tiles is performed by the GRASS GIS software integrated in the “make_img_tiles” Docker container. The OSM tiles are obtained dynamically from a locally running web server based on a customized MapServer Basemap project, providing OSM tiles in the EPSG 3035 coordinate system. The OSM basemaps are served from the isolated “web_server” Docker container. OSM tile generation is performed by the shp2img tool integrated in the “make_map_tiles” Docker container; a sketch of such a call is shown below.
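A hedged sketch of rendering one OSM tile with MapServer’s shp2img command-line tool (the mapfile name is illustrative; -e is the tile extent in EPSG 3035 and -s the output size in pixels, both standard shp2img options):

```python
import subprocess

def render_osm_tile(x0: float, y0: float, out_png: str, px: int = 2048) -> None:
    # (x0, y0) is the upper-left corner of the 30 x 30 km tile in EPSG 3035.
    subprocess.run(
        [
            "shp2img",
            "-m", "osm_basemap.map",   # customized MapServer Basemap mapfile (name assumed)
            "-o", out_png,
            "-e", str(x0), str(y0 - 30_000), str(x0 + 30_000), str(y0),
            "-s", str(px), str(px),
        ],
        check=True,
    )
```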
Next, the generated tiles are customized for the VR and AR applications using the convert tool integrated in the “flip_tiles” Docker container. The last step, converting DEM tiles to 3D objects (glTF data format), is automated in the form of a macro that runs when a dedicated QGIS project is opened directly in the QGIS application environment. The QGIS macro, developed in the Python programming language, is based on the Qgis2threejs plugin.
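One plausible reading of the “flip_tiles” step (our assumption; the paper does not spell out the reason): GeoTIFF rows are stored top-down, while texture V coordinates in game engines typically run bottom-up, so each tile image is mirrored vertically, for example, with ImageMagick:

```python
import subprocess

# Mirror one tile texture vertically; -flip is ImageMagick's vertical-mirror
# option. The file names are illustrative.
subprocess.run(["convert", "tile_0_0.png", "-flip", "tile_0_0_flipped.png"], check=True)
```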

4.3. VR Application

After preparing and refining the data, it was possible to start creating VR in Unreal Engine. The application was developed in Unreal Engine version 4.27. During the initial tests of this project, Unreal Engine 5 was still in beta, so the work was conducted in the latest stable version of Unreal Engine.
The first step was to create a 30 × 30 m hall within the game engine, which serves as a central space for user navigation. Within this hall, five basic stations, namely, Main Menu (Figure 8), Overview, Visualization, Comparison, and Animation, were strategically placed to engage users according to their specific needs and interactions. A view of all the stations is shown in Figure 9. Each station was carefully designed to present a different functionality; testing showed that assembling all functions into one larger station was rather cluttered. The stations’ distinct functions and strategic placement within the virtual hall aim to optimize user interaction and comprehension, ensuring a seamless and informative experience as users navigate the VR environment.
The user moves through the virtual space using teleportation, and some objects, for example, the legend for the land cover data, can be picked up. The user controls the selection of territory and the interaction at the stations with a laser pointer. The app is compatible with all headsets that support SteamVR, although it has been tested primarily on the HTC Vive Pro, Oculus Rift S, and HTC Vive Focus 3.
For the Main Menu station, an interactive map of the territories of interest, i.e., the Netherlands and the Czech Republic, was created. The map is divided into tiles in the same way it was divided in the data preparation phase. The user selects the territory of interest, and the selected tile is automatically displayed at all the other stations. Directly between the maps is the Overview station, where the 3D model of the tile is displayed with the base map as texture, so that users can immediately see whether it is the territory they want to analyze at the other stations.
Another station is Visualization. Here, the area of interest is shown on two tiles, one slightly tilted to reveal its relief, so the user can better see the character of the tile, whether it is mountainous, hilly, or flat. An in-game example of the Visualization station with sliders is shown in Figure 10. On the horizontal tile, the whole area is more visible. The user can interact using two sliders. One controls the translucency of the layers. The other is used to switch between the land cover data layers by year. Thanks to the blending function, both textures can be blended at the same time, so the user can have the base map under the land cover texture set to, for example, 25%. Users can thus see where they are in the area of interest and use the year slider to track changes in the landscape. Additionally, there is a board with a legend, either affixed to the wall or presented as an interactive list that the user can grasp and manipulate.
The Comparison station is another one the user can visit within the application. Here, two tiles are placed side by side, each controlled by its own slider, so the user can compare in detail different years of land cover data on the selected tile, such as the development and growth of cities, the decrease in forests, and more. Again, the legend stands are included.
The last station in the area is called Animation. As the name implies, an animation starts when the user enters it. The animation displays all the years chronologically in sequence and flips automatically through the land cover years every 4 s. The user can thus watch the growth of cities, forest loss, or the expansion of fields in the dynamic animation.

4.4. Blueprints in Unreal Engine

The Blueprint visual scripting system integrated into Unreal Engine is a comprehensive set of game scripting tools that uses a node-based interface directly in Unreal Engine. It is, in effect, a visual programming language based on so-called noodle graphs.
The system provides the user with great flexibility. One of the great advantages is the possibility for a user not familiar with a classical programming language (e.g., C++) to enter and customize the programmed functions [35]. The following paragraphs will present the solution of all the mentioned stations in the virtual world, which was created with the help of Blueprints.
The basic blueprint for the main menu is called BP_mainMenuMap, and it contains the MainMenuMapWidgetArray widget itself (Figure 11). The main function, displaying the selected object (SpawnActor), was programmed first. This function ensures that the selected array is displayed when clicked. At the same time, it is set here that if the selection changes, i.e., the user selects another area of interest, the existing object is destroyed (DestroyActor).
A slider is used to switch layers (Figure 12); using this widget, it is possible to switch the individual land cover layers for the area of interest or use a second slider that controls the opacity level for blending with the OpenStreetMap coverage. The slider operates on a material created specifically for each area of interest; the material contains all the land cover and OpenStreetMap textures. Figure 13 shows the complete slider and main menu widget graphs.
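In material terms, both sliders drive a simple linear interpolation of the two texture samples. With α denoting the opacity slider value in [0, 1], the rendered color is C = (1 − α) · T_landcover + α · T_OSM, which corresponds to a Lerp node between the two texture samplers in the Unreal material; this is our reading of the material described above, not a transcription of the exact Blueprint graph.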
All the data and this UE project are open source and can be downloaded in the GitLab repository (https://gitlab.com/geoharmonizer_inea/vr-ar-app-prototype, accessed on 2 July 2024).

4.5. AR Application

Although virtual and augmented reality are different technologies, part of the 3D data preparation process is similar. The 3D models created in the preparation for virtual reality, including the generated textures, were also used for augmented reality. Another element reused from the virtual reality application is the main menu, which operates on the same principle of selecting the area of interest.
As is well known and as mentioned in the introduction, AR brings certain limitations with it. The biggest difficulty is limited performance. It is thus not possible to work with data as detailed as in the case of VR, which is usually driven by a powerful graphics workstation.
Another limitation is, of course, the anchoring of the object. If only the camera of the smartphone or tablet is used to anchor the object, the 3D model is unstable and can move or bounce. Apple has brought a major improvement for object anchoring by adding a LiDAR sensor to its Pro devices, which largely eliminates these problems. Thanks to the LiDAR anchoring, the 3D model remains stable even if the user completely leaves the position with the device and comes back [36].
The requirements for the AR features were essentially the same as for the VR app, except for the tile comparison requirement. The feature requirements were therefore as follows: (1) the ability to designate the area of interest, (2) switching between different land cover data periods, and (3) seamless blending with the map during viewing. Instead of the discarded comparison function, an animation mode was added. Figure 14 shows a screenshot of the application.
The AR application was developed using the Swift programming language in conjunction with Xcode. During development, the SwiftUI framework was primarily used for the user interface, while some components were created using UIKit. A seamless combination of RealityKit and ARKit frameworks was used to run AR.
The app is specifically designed for iPads equipped with a spacious 12.9-inch display and is optimized to provide an immersive and visually rich experience. Because the app is designed for larger screens, the experience may be less optimal on smaller displays.
In terms of compatibility, the app is available across a variety of iOS devices (iPads) and supports iOS 12.1 and later.
Given the limited hardware performance of the target devices, and to keep the application fluid, the arbitrary-percentage blending of the OpenStreetMap base maps with the land cover texture had to be replaced. The best available solution was to prepare pre-made textures blended at fixed percentage overlays. Textures with 25%, 50%, and 75% blending were thus automatically generated for each land cover data layer; 0% blending corresponds to the pure land cover background and 100% to the map generated from the OpenStreetMap background. The user chooses among these values. Thanks to this modification, the application responds immediately to the user’s input, and this feature runs smoothly. The application can be downloaded from the App Store (OpenDataScience AR) and requires iPadOS 14.3 or later (https://apps.apple.com/us/app/opendatascience-ar/id1599107416, accessed on 2 July 2024). A sketch of how such pre-blended textures can be generated follows.
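The pre-blended textures bake the same linear interpolation used by the VR material offline. A minimal sketch with the Pillow imaging library, assuming matching tile textures (the file names are illustrative, not from the project repository):

```python
from PIL import Image

# Both textures must have the same size and mode for Image.blend.
lcv = Image.open("tile_lcv_2019.png").convert("RGB")   # land cover texture
osm = Image.open("tile_osm.png").convert("RGB")        # OpenStreetMap texture

for pct in (25, 50, 75):
    # Image.blend(a, b, alpha) computes a*(1 - alpha) + b*alpha per pixel,
    # so 0% stays pure land cover and 100% would be the pure OSM map.
    blended = Image.blend(lcv, osm, pct / 100.0)
    blended.save(f"tile_lcv_2019_blend{pct}.png")
```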

5. Discussion

The main question of this study was how to visualize classic GIS outputs using modern virtual and augmented reality technologies. Another challenge was to ensure that the selected area of interest could be fully implemented in the game engine. Both issues have been elaborated in this study, and the solutions and the way the desired results can be achieved are presented. In this project and research, we want to make open datasets available in a modern way and demonstrate how to work with them using the example of land cover products. Especially for the younger generation, this makes sense and increases the potential for data usability.
Large territories can be loaded and processed in a game engine using tiling [15], but based on our research, this 1:1 view is not effective, especially with lower-resolution datasets, such as the land cover used here. On scaled DEM models, such as those used in this study in both VR and AR, individual textures can be changed efficiently and blended with the base map using the opacity level. The developed techniques are functional and can be applied to different datasets. This study also includes a link to a Git repository where interested parties can freely download the tested and developed projects and apply them in their research.
The shortcomings are also worth mentioning. One drawback is the hardware, which is still not common for VR. This is why many applications are currently made for AR, where a smartphone, tablet, or laptop is sufficient. However, not everything can be visualized this way. There are still differences between VR and AR, which is why mixed reality (XR) is used, for example. The newly released HMD Apple Vision Pro, which already provides some basic demos, can be a great help and a significant change for displaying GIS data (Figure 15). Looking to the future, we can expect XR in this form to become more and more widely used, as it combines the best of both technologies: AR’s connection to the real environment and VR’s quality of data display.
This project is one of the first in such an area, so it is likely that more ideal and effective solutions may emerge over time. Software development is rapid; therefore, it is possible that in the near future, tiling will be unnecessary and large areas can be worked on.

6. Conclusions

The presented study offers a modern perspective on classical GIS data and their implementation in VR and AR environments. These visualization technologies are rapidly growing trends, with increasing popularity for visualizing various types of data. This study introduces an innovative approach to utilizing well-known GIS datasets (DEM, land cover, and OpenStreetMap) in modern development environments, such as Unreal Engine (VR) and Xcode with the SwiftUI framework (AR), providing examples of possible outputs using modern visualization technologies. This study shows the possible interconnection of GIS with VR and AR and the benefits that arise from it, such as the visualization of geospatial data in a virtual world, where the user is literally drawn into the relief of DEM models covered with different types of textures. This study also includes a description of the tiling method as an option for displaying large-scale models.
In this project, GIS data were prepared and processed using the open-source software QGIS 3.24. The workflow demonstrates that many technical issues must be addressed in the conversion and integration of the data. The VR application developed in the study demonstrates how data can appear in a virtual world. The application also showcases possibilities for users to manipulate data in the virtual world using VR glasses and controllers. Additionally, an AR application is introduced, which, compared to VR, has some limitations but appears to be more user-friendly, as it only requires an iOS tablet, in contrast to the hardware required for VR. Pre-built VR and AR projects are freely available; users only need to follow the procedure published in the Git repository (VR/AR App Prototype Git repository) to process user-defined areas of interest. It is possible that the complexity of the described procedure will decrease with the development of hardware and the expansion of software functions.

Author Contributions

Conceptualization, Karel Pavelka, Jr. and Martin Landa; methodology, Karel Pavelka, Jr. and Martin Landa; software, Karel Pavelka, Jr.; scripts, Martin Landa; validation, Karel Pavelka, Jr. and Martin Landa; formal analysis, Karel Pavelka, Jr. and Martin Landa; writing—original draft preparation, Karel Pavelka, Jr. and Martin Landa; writing—review and editing, Karel Pavelka, Jr. and Martin Landa; visualization, Karel Pavelka, Jr. and Martin Landa; project administration, Karel Pavelka, Jr.; funding acquisition, Karel Pavelka, Jr. and Martin Landa. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the CEF Telecom Geo-harmonizer project 2018-EU-IA-0095, co-financed by the European Health and Digital Executive Agency (HaDEA), and partially funded by an internal grant of the CTU in Prague, SGS 2024-SGS24/050/OHK1/1T/11 (Pavelka).

Data Availability Statement

The data and scripts produced by the Geo-harmonizer project are freely available from the sources listed on the project homepage: https://opendatascience.eu/.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Manson, S.; Bonsal, D.; Kernik, M.; Lambin, E. Geographic Information Systems and Remote Sensing. In International Encyclopedia of the Social & Behavioral Sciences, 2nd ed.; Wright, J.D., Ed.; Elsevier Inc.: St. Frisco, CO, USA, 2015; pp. 64–68. [Google Scholar] [CrossRef]
  2. Pavelka, K.; Šedina, J.; Matoušková, E. High Resolution Drone Surveying of the Pista Geoglyph in Palpa, Peru. Geosciences 2018, 8, 479. [Google Scholar] [CrossRef]
  3. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV Photogrammetry for Mapping and 3D Modeling—Current Status and Future Perspectives. In Proceedings of the International Conference on Unmanned Aerial Vehicle in Geomatics (UAV-g), Zurich, Switzerland, 14–16 September 2011. [Google Scholar] [CrossRef]
  4. Shults, R.; Levin, E.; Aukazhiyeva, Z.; Pavelka, K.; Kulichenko, N.; Kalabaev, N.; Sagyndyk, M.; Akhmetova, N. A Study of the Accuracy of a 3D Indoor Camera for Industrial Archaeology Applications. Heritage 2023, 6, 6240–6267. [Google Scholar] [CrossRef]
  5. Rheingold, H. Virtual Reality: The Revolutionary Technology of Computer-Generated Artificial Worlds—And How It Promises to Transform Society; Simon & Schuster: London, UK, 1992; ISBN 0-671-77897-8. [Google Scholar]
  6. Chan, M. Virtual Reality: Representations in Contemporary Media; Bloomsbury Academic: London, UK, 2014. [Google Scholar] [CrossRef]
  7. Koller, D.; Lindstrom, P.; Ribarsky, W.; Hodges, L.F.; Faust, N.; Turner, G. Virtual GIS: A real-time 3D geographic information system. In Proceedings of the Visualization ’95. Visualization ’95, Atlanta, GA, USA, 29 October–3 November 1995. [Google Scholar] [CrossRef]
  8. Lütjens, M.; Kersten, T.P.; Dorschel, B.; Tschirschwitz, F. Virtual Reality in Cartography: Immersive 3D Visualization of the Arctic Clyde Inlet (Canada) Using Digital Elevation Models and Bathymetric Data. Multimodal Technol. Interact. 2019, 3, 9. [Google Scholar] [CrossRef]
  9. Wittner, E.; Hansen, R.; Fabriciu, T. VR and AR in ArcGIS: An Introduction. 2019. Available online: https://www.esri.com/content/dam/esrisites/en-us/about/events/media/UC-2019/technical-workshops/tw-6064-1136.pdf (accessed on 22 January 2024).
  10. Mueller, P.; Wittner, E.; Hansen, R.; Meriaux, A. Virtual Reality (VR) and Augmented Reality (AR) with ArcGIS. 2019. Available online: https://mediaspace.esri.com/channel/2019+Esri+Developer+Summit/244792152 (accessed on 22 January 2024).
  11. Google Earth. Available online: https://earth.google.com (accessed on 22 January 2024).
  12. Longley, P. Geographical Information Systems: Principles, Techniques, Management and Applications, 2nd ed.; Longley, P.A., Goodchild, M.F., Maguire, D.J., Rhind, D.W., Eds.; Wiley Publishing: Hoboken, NJ, USA, 2005; ISBN 978-0-471-73545-8. [Google Scholar]
  13. Baumgartinger, M. Concepts of Virtual Reality in GIS. Bachelor’s Thesis, University of Vienna, Vienna, Austria, 2020. [Google Scholar] [CrossRef]
  14. Bazargani, J.S.; Zafari, M.; Sadeghi-Niaraki, A.; Choi, S.-M. A Survey of GIS and AR Integration: Applications. Sustainability 2022, 14, 10134. [Google Scholar] [CrossRef]
  15. Janovský, M.; Tobiáš, P.; Cehák, V. 3D Visualisation of the Historic Pre-Dam Vltava River Valley—Procedural and CAD Modelling, Online Publishing and Virtual Reality. ISPRS Int. J. Geo Inf. 2022, 11, 376. [Google Scholar] [CrossRef]
  16. Rink, K.; Nixdorf, E.; Zhou, C.; Hillmann, M.; Bilke, L. A Virtual Geographic Environment for Multi-Compartment Water and Solute Dynamics in Large Catchments. J. Hydrol. 2020, 582, 124507. [Google Scholar] [CrossRef]
  17. Helbig, C.; Becker, A.M.; Mason, T.; Mohamdeen, A.; Sen, Ö.O.; Schlink, U. A game engine-based application for visualizing and analysing environmental spatiotemporal mobile sensor data in an urban context. Front. Environ. Sci. 2022, 10, 952725. [Google Scholar] [CrossRef]
  18. Pavelka, K., Jr.; Raeva, P. Virtual museums—The future of historical monuments documentation and visualization. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W15, 903–908. [Google Scholar] [CrossRef]
  19. Pavelka, K., Jr.; Pacina, J. Using of modern technologies for visualization of cultural heritage. Civ. Eng. J. 2023, 32, 549–563. [Google Scholar] [CrossRef]
  20. Lee, H.; Jung, T.H.; tom Dieck, M.C.; Chung, N. Experiencing Immersive Virtual Reality in Museums. Inf. Manag. 2020, 57, 103229. [Google Scholar] [CrossRef]
  21. Witjes, M.; Parente, L.; van Diemen, C.J.; Hengl, T.; Landa, M.; Brodský, L.; Halounová, L.; Križan, J.; Antonić, L.; Ilie, C.M.; et al. A spatiotemporal ensemble machine learning framework for generating land use/land cover time-series maps for Europe (2000–2019) based on LUCAS, CORINE and GLAD Landsat. PeerJ 2022, 10, e13573. [Google Scholar] [CrossRef] [PubMed]
  22. GeaVR|Visual Discovery Framework Homepage. Available online: https://www.geavr.eu (accessed on 22 January 2024).
  23. Tibaldi, A.; Bonali, F.L.; Vitello, F.; Delage, E.; Nomikou, P.; Antoniou, V.; Becciani, U.; de Vries, B.V.W.; Krokos, M.; Whitworth, M. Real world–based immersive Virtual Reality for research, teaching and communication in volcanology. Bull. Volcanol. 2020, 82, 38. [Google Scholar] [CrossRef]
  24. Antoniou, V.; Bonali, F.L.; Nomikou, P.; Tibaldi, A.; Melissinos, P.; Mariotto, F.P.; Vitello, F.R.; Krokos, M.; Whitworth, M. Integrating Virtual Reality and GIS Tools for Geological Mapping, Data Collection and Analysis: An Example from the Metaxa Mine, Santorini (Greece). Appl. Sci. 2020, 10, 8317. [Google Scholar] [CrossRef]
  25. Singla, J.G. Virtual reality based novel use case in remote sensing and GIS. Curr. Sci. 2021, 121, 958. [Google Scholar] [CrossRef]
  26. Vitello, F.R. GeaVR—Navigation Modes. 2020. Available online: https://www.youtube.com/watch?v=lfqusHa5xIk (accessed on 22 January 2024).
  27. Höhl, W. Official Survey Data and Virtual Worlds—Designing an Integrative and Economical Open-Source Production Pipeline for xR-Applications in Small and Medium-Sized Enterprises. Big Data Cogn. Comput. 2020, 4, 26. [Google Scholar] [CrossRef]
  28. EUMAP Library Homepage. Available online: https://eumap.readthedocs.io/en/latest/ (accessed on 26 December 2023).
  29. VR/AR App Prototype Git Repository. Available online: https://gitlab.com/geoharmonizer_inea/vr-ar-app-prototype (accessed on 26 December 2023).
  30. Parente, L.; Witjes, M.; Hengl, T.; Landa, M.; Brodsky, L. Continental Europe land cover mapping at 30 m resolution based on CORINE and LUCAS samples (v0.1). Zenodo 2021. [Google Scholar]
  31. Hengl, T.; Parente, L.; Krizan, J.; Bonannella, C. Continental Europe Digital Terrain Model at 30 m resolution based on GEDI, ICESat-2, AW3D, GLO-30, EUDEM, MERIT DEM and background layers (v0.3). Zenodo 2020. [Google Scholar] [CrossRef]
  32. OpenStreetMap Contributors. Planet Dump. 2022. Available online: https://planet.openstreetmap.org (accessed on 1 April 2022).
  33. Three.js JavaScript 3D Library. Available online: https://threejs.org/ (accessed on 7 April 2024).
  34. Qgis2threejs QGIS Plugin. Available online: https://plugins.qgis.org/plugins/Qgis2threejs/ (accessed on 7 April 2024).
  35. Introduction to Blueprints. Available online: https://docs.unrealengine.com/4.27/en-US/ProgrammingAndScripting/Blueprints/GettingStarted/ (accessed on 7 April 2024).
  36. Next Generation AR and How the Arrival of LiDAR Presents New Creative Opportunities. 2020. Available online: https://lbbonline.com/news/next-generation-ar-and-how-the-arrival-of-lidar-presents-new-creative-opportunities (accessed on 7 April 2024).
  37. Apple Newsroom. Apple Vision Pro Brings a New Era of Spatial Computing to Business. 2024. Available online: https://www.apple.com/newsroom/2024/04/apple-vision-pro-brings-a-new-era-of-spatial-computing-to-business (accessed on 31 May 2024).
Figure 1. Esri AR application DEM (digital elevation model) with map texture in augmented reality (Esri Developer Summit [10]).
Figure 2. GeaVR framework in use—navigation/flyover [26].
Figure 3. Used data sets (land cover, OpenStreetMap, DEM). Source: authors.
Figure 4. Map tiling (Czech Republic). Source: authors.
Figure 5. Map tiling—DEM generation (Czech Republic). Source: authors.
Figure 6. An example of a DEM GIS data layer; 3D model of the DEM (right). Source: authors.
Figure 7. GIS data preprocessing workflow.
Figure 8. Main menu station in Unreal Engine. Source: authors.
Figure 9. View of all stations in Unreal Engine. Source: authors.
Figure 10. Visualization station with sliders (opacity level and year). Source: authors.
Figure 11. Widget of the main menu. Source: authors.
Figure 12. Widget slider. Source: authors.
Figure 13. Example of “noodle” graphs—widget slider (top) and main menu widget (bottom). Source: authors.
Figure 14. Augmented reality app. Source: authors.
Figure 15. Apple Vision Pro—HMD XR [37].
