Performance Investigations of VSLAM and Google Street View Integration in Outdoor Location-Based Augmented Reality under Various Lighting Conditions
Abstract
1. Introduction
2. Related Works in the Literature
2.1. Visual Positioning in Outdoor LAR
2.2. VSLAM and GSV in AR Applications
2.3. AR Performance and Experience Measurement
3. LAR System for Experiments
3.1. Overview of VSLAM and GSV Integration Approach
3.2. Implementation of LAR System
4. Experiment Design and Implementation
4.1. Experimental Design
4.2. Hardware Setup for Data Collection
4.3. Testing Locations and Environments
4.3.1. Definitions of Variables
4.3.2. Independent Variable Selection
4.4. Data Collection and Processing
4.4.1. Error Distance Calculation
4.4.2. Horizontal Surface Tracking Time Calculation
5. Experiment Results and Analysis
5.1. Comparative Analysis Result
5.2. Minimum Illumination Threshold
5.3. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Specification | Details
---|---
Model | Sony Xperia XZ1 G8342
Android Version | Android 9.0 (Pie)
Resolution | 1080 × 1920 pixels, 16:9 ratio (424 ppi density)
Processor | Octa-core (4 × 2.45 GHz Kryo, 4 × 1.9 GHz Kryo)
Chipset | Qualcomm MSM8998 Snapdragon 835 (10 nm)
Storage | 64 GB
Battery | Li-Ion 2700 mAh
Available Sensors | GPS, Accelerometer, Gyroscope, Compass, Barometer, Proximity, and Ambient Light
Natural Light Conditions | Locations | n | Illuminance (lx) Mean/SD | Distance Error (m) Mean/SD | Horizontal Surface Tracking Time (s) Mean/SD
---|---|---|---|---|---
Daylight | POI 1 | 5 | 5114.20/1453.15 | 0.71/0.06 | 0.73/0.05
Daylight | POI 2 | 5 | 4573.80/1253.01 | 0.84/0.05 | 0.71/0.09
Daylight | POI 3 | 5 | 4078.00/1732.91 | 0.92/0.06 | 0.57/0.06
Daylight | POI 4 | 5 | 6455.40/1558.50 | 0.82/0.11 | 0.56/0.07
Daylight | POI 5 | 5 | 4800.60/1415.47 | 0.81/0.02 | 0.64/0.08
Day Overcast | POI 1 | 5 | 669.20/140.23 | 0.79/0.04 | 0.63/0.09
Day Overcast | POI 2 | 5 | 733.40/261.71 | 0.84/0.05 | 0.67/0.09
Day Overcast | POI 3 | 5 | 735.60/323.05 | 0.86/0.08 | 0.62/0.07
Day Overcast | POI 4 | 5 | 1026.40/664.74 | 0.81/0.10 | 0.64/0.08
Day Overcast | POI 5 | 5 | 798.40/245.15 | 0.81/0.02 | 0.61/0.06
Day Rainy | POI 1 | 5 | 541.80/121.30 | 0.85/0.03 | 1.38/0.38
Day Rainy | POI 2 | 5 | 457.00/103.11 | 0.88/0.05 | 1.29/0.40
Day Rainy | POI 3 | 5 | 455.00/148.00 | 0.92/0.05 | 1.49/0.61
Day Rainy | POI 4 | 5 | 452.40/144.69 | 0.87/0.05 | 1.32/0.14
Day Rainy | POI 5 | 5 | 504.20/112.87 | 0.78/0.01 | 1.56/0.17
Low Light | POI 1 | 5 | 177.60/94.75 | 9.14/0.68 | 6.78/4.60
Low Light | POI 2 | 5 | 146.00/69.38 | 8.96/0.42 | 6.95/4.33
Low Light | POI 3 | 5 | 80.60/67.74 | 9.82/0.52 | 9.86/4.97
Low Light | POI 4 | 5 | 152.00/83.02 | 9.37/0.74 | 7.55/7.00
Low Light | POI 5 | 5 | 129.60/58.98 | 9.75/0.29 | 7.24/5.83
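Because each condition covers five POIs with equal sample sizes (n = 5 each), the condition-level means reported elsewhere in the paper can be reproduced by averaging the five per-POI means. A minimal sketch (our own code and variable names, not the authors'), using the Daylight row values from the table above:

```python
# Aggregate per-POI means into a condition-level mean.
# With equal per-POI sample sizes, the pooled mean equals
# the mean of the per-POI means.
def condition_mean(poi_means):
    """Round to two decimals, matching the table's precision."""
    return round(sum(poi_means) / len(poi_means), 2)

daylight = {
    "illuminance_lx": [5114.20, 4573.80, 4078.00, 6455.40, 4800.60],
    "distance_error_m": [0.71, 0.84, 0.92, 0.82, 0.81],
    "tracking_time_s": [0.73, 0.71, 0.57, 0.56, 0.64],
}

print(condition_mean(daylight["illuminance_lx"]))    # 5004.4 lx
print(condition_mean(daylight["distance_error_m"]))  # 0.82 m
print(condition_mean(daylight["tracking_time_s"]))   # 0.64 s
```

These match the Daylight row of the condition-level summary (5004.40 lx, 0.82 m, 0.64 s).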
Natural Light Conditions | n | Illuminance 95% CI (lx), Lower–Upper | Illuminance 95% CI Margin (± lx) | Illuminance (lx) Mean/SD | Distance Error Mean (m) | Horizontal Surface Tracking Time Mean (s)
---|---|---|---|---|---|---
Daylight | 25 | 4382–5627 | 622.28 | 5004.40/1587.47 | 0.82 | 0.64
Day Overcast | 25 | 650–935 | 142.27 | 792.60/362.94 | 0.82 | 0.63
Day Rainy | 25 | 434–530 | 47.69 | 482.08/121.65 | 0.86 | 1.41
Low Light | 25 | 107–167 | 30.04 | 137.16/76.63 | 9.41 | 7.68
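The confidence-interval figures above are consistent with a normal-approximation 95% interval, i.e. a margin of 1.96 · SD/√n around the mean (e.g. for Day Overcast, 1.96 × 362.94/√25 ≈ 142.27 lx). A minimal sketch of that calculation (our own code, not the authors'):

```python
import math

def ci95_margin(sd, n):
    """Half-width of a normal-approximation 95% confidence interval."""
    return 1.96 * sd / math.sqrt(n)

# Day Overcast illuminance: SD = 362.94 lx over n = 25 samples.
margin = ci95_margin(362.94, 25)
print(round(margin, 2))  # 142.27, matching the table
```

The same formula reproduces the Day Rainy (47.69) and Low Light (30.04) margins, and the Daylight margin to within rounding of the reported SD.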
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Brata, K.C.; Funabiki, N.; Riyantoko, P.A.; Panduman, Y.Y.F.; Mentari, M. Performance Investigations of VSLAM and Google Street View Integration in Outdoor Location-Based Augmented Reality under Various Lighting Conditions. Electronics 2024, 13, 2930. https://doi.org/10.3390/electronics13152930