Simulating Wearable Urban Augmented Reality Experiences in VR: Lessons Learnt from Designing Two Future Urban Interfaces
Abstract
1. Introduction
1. To what extent can the method of simulating wearable urban AR experiences in VR elicit insights into the functional benefits of AR concepts and the impact of urban contextual factors?
2. What are the limitations of simulating wearable urban AR experiences in VR?
2. Related Work
2.1. Urban AR Applications
2.2. Simulating Wearable AR Experiences in VR
3. Case Studies
3.1. Pedestrian Navigation
3.1.1. Design Context
3.1.2. Prototype Development
3.1.3. Evaluation Study
3.2. Interaction with Autonomous Vehicles
3.2.1. Design Context
3.2.2. Prototype Development
3.2.3. Evaluation Study
4. Data Analysis
5. Results
5.1. Participants’ Feedback on AR Prototypes
5.1.1. AR Glasses
5.1.2. Functionality
5.1.3. User Experience
5.1.4. Information
5.1.5. Visualisation
5.1.6. Sound
5.1.7. Spatial Positioning
5.1.8. Interactivity
5.2. Participants’ Behaviour in Immersive Virtual Environments
6. Discussion
6.1. VR Simulation Efficacy in Evaluating Wearable Urban AR Applications (RQ1)
Evaluating the Impact of Urban Contextual Factors
- (1) Safety concerns: The simulated context heightened participants’ awareness of their safety, prompting them to take note of the safety aspects of the design concepts (as in the navigation study) and of their feeling of safety (as in the AV study). For instance, most participants in the navigation study expected to use the map interface while walking and therefore disfavoured a design that might obscure the real world or distract them from their surroundings. As a result, researchers can utilise VR to experiment with different spatial arrangements of AR content or even give users the option to customise the placement in ways that best fit the situational context.
- (2) Attentional capacity: User statements suggested that the sheer quantity of visual distractions in the urban setting (e.g., two-way traffic) makes it more likely for users to overlook conformal AR content, particularly content positioned at the periphery (e.g., car overlays). This divided attention exemplifies how VR prototyping may aid in the discovery of usability issues that might arise during in-situ deployments. Furthermore, simulated environmental distractions help determine whether the AR content might exceed users’ attentional and cognitive capacity in a specific situation. For example, the potential information overload in AV–pedestrian communication was examined in a multi-vehicle scenario, and the obtained feedback revealed the relevance of each information cue. Notably, much of the feedback was shaped by participants’ temporal and spatial awareness within the virtual environment. For example, many participants expressed their fear of not knowing how close to them the vehicles would stop, with one participant even hesitating to cross the road when the zebra crossing appeared before the AVs had come to a complete stop.
- (3) Social considerations: Participants were considerate of the potential adverse effects of their AR usage on others; for example, one participant in the AV study wanted to ensure that traffic movement would resume after he had crossed the street (A21). However, it should be noted that the number of social considerations related to interaction techniques was relatively small. This result could be attributed to the simple interaction in the AV study and the fact that only interactions with the on-hand map in the navigation study involved hand movement.
6.2. VR Simulation Limitations (RQ2)
- Examining visual fidelity: Regarding AR visualisations, the prototypes conveyed a rather sophisticated appearance, eliciting user feedback on numerous visual properties. However, the validity of findings concerning qualities such as colour, transparency, and text legibility may be called into question. For example, while feedback that the map was not transparent enough and that the place-mark icons were hard to read helped identify usability issues, these issues may have been partly caused by the VR system itself (e.g., the display resolution of the headset used). In addition, some issues may not manifest under simulated conditions because influencing factors such as outdoor lighting [47] and the visual complexity of background textures [79] cannot be realistically recreated. Further, an interview study with industry practitioners reported that the targeted display technology can also alter the colours of an AR application [78], necessitating the testing of colour variations through actual deployment on the target device. For these reasons, a simulated AR prototype may not be the best-suited manifestation for examining the fine-grained visual qualities of AR interfaces and related usability issues (e.g., distinguishing colours [71]).
- Producing exhaustive contextual feedback: Although the VR simulation provided in-context feedback on wearable urban AR applications, the method should only be used as a first step to narrow down potential concepts; it does not replace field experiments, for a number of reasons. First, prototyping and evaluating wearable AR applications in VR is by its very nature a highly controlled activity. Not only the application and the intended use environment are simulated, but also the instructions and tasks, resulting in a noisy approximation of how people actually use a product. In this regard, in-the-wild AR studies can be a more effective approach to understanding how an AR application is used in authentic everyday situations; for example, the participants in the study by Lu and Bowman [80] used a working prototype in their own contexts, unsupervised, for at least 2.5 h. Second, it is nearly impossible to replicate the messiness and variability of the real world and its potential effects. For example, in a field trial of a mobile AR map, Morrison et al. [81] found that sunlight and shade influenced the way participants held the device to view the AR visualisations. VR therefore bridges the gap between ecological validity and experimental control, but it does not remove the need for, or the ‘hassle’ of, conducting field studies [82,83].
6.3. Recommendations for Simulating Wearable Urban AR Experiences in VR
6.3.1. Experience of Wearing Smartglasses
- R1—Emphasising the experience of wearing smartglasses. Tapping is one of the most common hand gestures that allow users to engage with AR content. Smartglasses users can either perform air-tap gestures (as in HoloLens applications) or tap a touchpad on the temple (as in Google Glass applications). In the AV study, we implemented the latter mainly because it involves physical contact with the headset, which serves to emphasise the experience of wearing smartglasses. The gesture was natural, easy to implement, and contributed to greater user feedback on the AR glasses in the AV study. Yet, one disadvantage of this technique is that it is not applicable when AR interaction techniques [44,84] are the focus of the investigation.
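A minimal Unity C# sketch of one way to approximate such a temple-tap gesture in VR is given below. It is illustrative only: the head and hand transforms, the 0.12 m radius, and the button mapping are assumptions rather than details of the implementation used in the study. The tap fires when the tracked right hand is near a point beside the headset while the controller's primary button is pressed, which is what creates the sensation of touching the glasses.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: approximates a "tap the temple" gesture in VR by checking
// whether the tracked right hand/controller is close to a point beside the user's
// head while the primary button is pressed. Names and thresholds are illustrative.
public class TempleTapGesture : MonoBehaviour
{
    [SerializeField] Transform head;          // e.g., the XR camera transform
    [SerializeField] Transform rightHand;     // tracked controller or hand anchor
    [SerializeField] float tapRadius = 0.12f; // metres around the temple point
    [SerializeField] Vector3 templeOffset = new Vector3(0.08f, 0.02f, 0.0f);

    bool wasPressed;

    void Update()
    {
        // Point just beside the right temple, in world space.
        Vector3 templePoint = head.TransformPoint(templeOffset);
        bool handAtTemple = Vector3.Distance(rightHand.position, templePoint) < tapRadius;

        var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        device.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed);

        // A rising edge while the hand is at the temple counts as a tap.
        if (handAtTemple && pressed && !wasPressed)
            OnTempleTap();

        wasPressed = pressed;
    }

    void OnTempleTap()
    {
        Debug.Log("Temple tap detected: advance the AR prompt.");
        // e.g., notify the AR interface controller here.
    }
}
```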
6.3.2. AR Content Rendering
- R2—Making AR content stand out: To differentiate AR imagery from virtual environments, we created AR materials in emissive and translucent colours, resembling interfaces typically seen in science fiction films. To strengthen the effect, we propose increasing the realism of the environment with high-poly 3D models while using lower poly meshes for AR elements. However, because there is a trade-off between using complex mesh models and maintaining high simulation performance, particular attention must be paid to different VR optimisation techniques.
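As a minimal sketch of this kind of emissive, translucent overlay material in Unity, the snippet below configures the built-in Standard shader at runtime; the colour values, shader choice, and property settings are illustrative assumptions rather than the exact materials used in the prototypes.

```csharp
using UnityEngine;

// Hypothetical sketch: gives an "AR overlay" object an emissive, semi-transparent
// look so it stands out from the photorealistic virtual environment. Shader and
// property names follow Unity's built-in Standard shader; values are illustrative.
public class ArOverlayMaterial : MonoBehaviour
{
    [SerializeField] Color overlayColour = new Color(0.2f, 0.8f, 1.0f, 0.35f); // cyan, ~35% opaque
    [SerializeField] Color emissionColour = new Color(0.2f, 0.8f, 1.0f) * 1.5f;

    void Start()
    {
        var material = new Material(Shader.Find("Standard"));

        // Transparent rendering mode (same keywords the Standard shader inspector sets).
        material.SetFloat("_Mode", 3f);
        material.SetInt("_SrcBlend", (int)UnityEngine.Rendering.BlendMode.SrcAlpha);
        material.SetInt("_DstBlend", (int)UnityEngine.Rendering.BlendMode.OneMinusSrcAlpha);
        material.SetInt("_ZWrite", 0);
        material.EnableKeyword("_ALPHABLEND_ON");
        material.renderQueue = (int)UnityEngine.Rendering.RenderQueue.Transparent;

        // Emissive colour so the overlay reads as glowing screen content.
        material.EnableKeyword("_EMISSION");
        material.SetColor("_Color", overlayColour);
        material.SetColor("_EmissionColor", emissionColour);

        GetComponent<Renderer>().material = material;
    }
}
```

Keeping overlay geometry low-poly while the surrounding environment uses high-poly models reinforces this visual separation.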
- R3—Simulating registration inaccuracy: A key issue of existing AR HMDs is the lack of image registration accuracy, which results in misalignments between the augmented imagery and physical objects [6]. While VR usage alleviates this problem, it was found during pilot tests that a perfect registration made distinguishing between AR and VR elements challenging. Therefore, we deliberately simulated a small degree of registration inaccuracy, for example, creating a noticeable gap between the car overlay and the car body. Participants in the pilot tests specifically commented on the usefulness of this technique in recognising digital overlays.
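One possible way to introduce such a deliberate registration error in Unity is sketched below; the offset, smoothing time, and noise-based drift are illustrative assumptions rather than the exact values used in the prototypes.

```csharp
using UnityEngine;

// Hypothetical sketch: makes a conformal AR overlay track its anchor (e.g., a car body)
// with a small, deliberate offset, lag, and drift so participants can tell the overlay
// apart from the "real" (virtual) object. All values are illustrative.
public class SimulatedRegistrationError : MonoBehaviour
{
    [SerializeField] Transform anchor;                            // the tracked object, e.g., the car body
    [SerializeField] Vector3 offset = new Vector3(0f, 0.05f, 0f); // visible gap above the anchor
    [SerializeField] float followLag = 0.15f;                     // seconds of smoothing (tracking latency)
    [SerializeField] float jitterAmplitude = 0.01f;               // metres of low-frequency drift

    Vector3 velocity;

    void LateUpdate()
    {
        // Slow, Perlin-noise-based drift mimics imperfect pose estimation.
        Vector3 jitter = new Vector3(
            Mathf.PerlinNoise(Time.time, 0f) - 0.5f,
            Mathf.PerlinNoise(0f, Time.time) - 0.5f,
            0f) * (2f * jitterAmplitude);

        Vector3 target = anchor.position + offset + jitter;

        // SmoothDamp introduces a perceivable lag behind the moving anchor.
        transform.position = Vector3.SmoothDamp(transform.position, target, ref velocity, followLag);
        transform.rotation = anchor.rotation;
    }
}
```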
6.3.3. Contextual Setting
- R4—Determining the social and environmental factors to be incorporated: Our findings demonstrate that the simulation of social and environmental factors frequently found in an urban setting, such as road traffic and human activities, contributed to participants’ sense of presence. These factors are particularly critical when individuals are exposed to the virtual environment for an extended period (e.g., navigating or exploring the city). In addition to improving the experience in VR, however, the overall rationale for incorporating social and environmental factors should be to better assess their influence on the usability of urban AR applications. For example, in the navigation study, participants referring to background scenes we deliberately created (e.g., a policeman running after a thief) provided us with implicit but valuable feedback that our AR application offered sufficient situational awareness.
- R5—Incorporating different levels of detail: The extent to which contextual factors are modelled in detail, we argue, should vary according to their role and possible impact on the urban AR experience under investigation. Vehicle behaviour, for example, was not replicated as precisely in the navigation study as in the AV study because participants were not meant to engage directly with road traffic. Rather than managing every driving parameter (e.g., speed and deceleration rate), we used the built-in navigation system of Unity to fill the city with intelligent car agents, lowering the time and effort required to build the prototype. This also conforms with what Lim et al. [19] more broadly refer to as the economic principle of prototyping.
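A minimal sketch of populating a scene with such ambient car agents via Unity's built-in navigation (NavMesh) is shown below; the car prefab, waypoints, and speed range are placeholders rather than details of the actual prototype.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Hypothetical sketch: fills the virtual city with simple "intelligent" car agents
// using Unity's built-in navigation (NavMesh) instead of scripting every driving
// parameter by hand. The prefab, waypoints, and speeds are placeholders.
public class AmbientTrafficSpawner : MonoBehaviour
{
    [SerializeField] GameObject carPrefab;  // prefab with a NavMeshAgent component
    [SerializeField] Transform[] waypoints; // points on the baked road NavMesh
    [SerializeField] int carCount = 10;

    void Start()
    {
        for (int i = 0; i < carCount; i++)
        {
            Transform start = waypoints[Random.Range(0, waypoints.Length)];
            GameObject car = Instantiate(carPrefab, start.position, start.rotation);

            var agent = car.GetComponent<NavMeshAgent>();
            agent.speed = Random.Range(4f, 8f); // rough urban driving speeds (m/s)
            agent.SetDestination(PickDestination(start).position);
        }
    }

    Transform PickDestination(Transform current)
    {
        if (waypoints.Length < 2) return current; // guard against a single waypoint
        Transform next;
        do { next = waypoints[Random.Range(0, waypoints.Length)]; } while (next == current);
        return next;
    }
}
```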
6.4. Limitations and Future Work
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Azuma, R.T. A survey of augmented reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
- Paavilainen, J.; Korhonen, H.; Alha, K.; Stenros, J.; Koskinen, E.; Mayra, F. The Pokémon GO experience: A location-based augmented reality mobile game goes mainstream. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 2493–2498. [Google Scholar]
- Azuma, R.T. The road to ubiquitous consumer augmented reality systems. Hum. Behav. Emerg. Technol. 2019, 1, 26–32. [Google Scholar] [CrossRef] [Green Version]
- Dey, A.; Billinghurst, M.; Lindeman, R.W.; Swan, J. A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Front. Robot. AI 2018, 5, 37. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Azuma, R.T. The challenge of making augmented reality work outdoors. Mix. Real. Merging Real Virtual Worlds 1999, 1, 379–390. [Google Scholar]
- Billinghurst, M. Grand Challenges for Augmented Reality. Front. Virtual Real. 2021, 2, 12. [Google Scholar] [CrossRef]
- Voit, A.; Mayer, S.; Schwind, V.; Henze, N. Online, VR, AR, Lab, and In-Situ: Comparison of Research Methods to Evaluate Smart Artifacts. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–12. [Google Scholar]
- Mäkelä, V.; Radiah, R.; Alsherif, S.; Khamis, M.; Xiao, C.; Borchert, L.; Schmidt, A.; Alt, F. Virtual Field Studies: Conducting Studies on Public Displays in Virtual Reality. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, CHI ’20, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–15. [Google Scholar] [CrossRef]
- Yao, T.; Yoo, S.; Parker, C. Evaluating Virtual Reality as a Tool for Empathic Modelling of Vision Impairment. In Proceedings of the OzCHI ’21, Melbourne, VIC, Australia, 30 November–2 December 2021. [Google Scholar]
- Shah, S.; Dey, D.; Lovett, C.; Kapoor, A. Airsim: High-fidelity visual and physical simulation for autonomous vehicles. In Proceedings of the 11th Conference on Field and Service Robotics, Zurich, Switzerland, 12–15 September 2017; Springer: Cham, Switzerland, 2018; pp. 621–635. [Google Scholar]
- Tran, T.T.M.; Parker, C.; Tomitsch, M. A review of virtual reality studies on autonomous vehicle–pedestrian interaction. IEEE Trans. Hum. Mach. Syst. 2021, 51, 641–652. [Google Scholar] [CrossRef]
- Colley, M.; Eder, B.; Rixen, J.O.; Rukzio, E. Effects of Semantic Segmentation Visualization on Trust, Situation Awareness, and Cognitive Load in Highly Automated Vehicles. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–11. [Google Scholar]
- Kim, S.; Dey, A.K. Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009; pp. 133–142. [Google Scholar]
- Jose, R.; Lee, G.A.; Billinghurst, M. A comparative study of simulated augmented reality displays for vehicle navigation. In Proceedings of the 28th Australian Conference on Computer-Human Interaction, Launceston, TAS, Australia, 29 November–2 December 2016; pp. 40–48. [Google Scholar]
- Riva, G.; Mantovani, F.; Capideville, C.S.; Preziosa, A.; Morganti, F.; Villani, D.; Gaggioli, A.; Botella, C.; Alcañiz, M. Affective interactions using virtual reality: The link between presence and emotions. Cyberpsychology Behav. 2007, 10, 45–56. [Google Scholar] [CrossRef] [PubMed]
- Deb, S.; Carruth, D.W.; Sween, R.; Strawderman, L.; Garrison, T.M. Efficacy of virtual reality in pedestrian safety research. Appl. Ergon. 2017, 65, 449–460. [Google Scholar] [CrossRef]
- Grandi, J.G.; Cao, Z.; Ogren, M.; Kopper, R. Design and Simulation of Next-Generation Augmented Reality User Interfaces in Virtual Reality. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, Portugal, 27 March–1 April 2021; pp. 23–29. [Google Scholar]
- Pratticò, F.G.; Lamberti, F.; Cannavò, A.; Morra, L.; Montuschi, P. Comparing State-of-the-Art and Emerging Augmented Reality Interfaces for Autonomous Vehicle-to-Pedestrian Communication. IEEE Trans. Veh. Technol. 2021, 70, 1157–1168. [Google Scholar] [CrossRef]
- Lim, Y.K.; Stolterman, E.; Tenenberg, J. The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Trans. Comput. Hum. Interact. (TOCHI) 2008, 15, 7. [Google Scholar] [CrossRef]
- Buchenau, M.; Suri, J.F. Experience prototyping. In Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, New York City, NY, USA, 17–19 August 2000; pp. 424–433. [Google Scholar]
- Bowman, D.A.; Stinson, C.; Ragan, E.D.; Scerbo, S.; Höllerer, T.; Lee, C.; McMahan, R.P.; Kopper, R. Evaluating effectiveness in virtual environments with MR simulation. In Proceedings of the Interservice/Industry Training, Simulation, and Education Conference, Orlando, FL, USA, 3–6 December 2012; Volume 4, p. 44. [Google Scholar]
- Lee, C.; Bonebrake, S.; Hollerer, T.; Bowman, D.A. A replication study testing the validity of ar simulation in vr for controlled experiments. In Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality, Orlando, FL, USA, 19–22 October 2009; pp. 203–204. [Google Scholar]
- Lee, C.; Bonebrake, S.; Bowman, D.A.; Höllerer, T. The role of latency in the validity of AR simulation. In Proceedings of the 2010 IEEE Virtual Reality Conference (VR), Boston, MA, USA, 20–24 March 2010; pp. 11–18. [Google Scholar]
- UN ECOSOC. The UNECE–ITU Smart Sustainable Cities Indicators; UN ECOSOC: New York City, NY, USA; Geneva, Switzerland, 2015.
- Tomitsch, M. Making Cities Smarter; JOVIS Verlag GmbH: Berlin, Germany, 2017. [Google Scholar]
- Narzt, W.; Pomberger, G.; Ferscha, A.; Kolb, D.; Müller, R.; Wieghardt, J.; Hörtner, H.; Lindinger, C. Augmented reality navigation systems. Univers. Access Inf. Soc. 2006, 4, 177–187. [Google Scholar] [CrossRef]
- Jingen Liang, L.; Elliot, S. A systematic review of augmented reality tourism research: What is now and what is next? Tour. Hosp. Res. 2021, 21, 15–30. [Google Scholar] [CrossRef]
- Parker, C.; Tomitsch, M.; Kay, J.; Baldauf, M. Keeping it private: An augmented reality approach to citizen participation with public displays. In Proceedings of the Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, Osaka, Japan, 7–11 September 2015; pp. 807–812. [Google Scholar]
- Riegler, A.; Riener, A.; Holzmann, C. A Research Agenda for Mixed Reality in Automated Vehicles. In Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia, Essen, Germany, 22–25 November 2020; pp. 119–131. [Google Scholar]
- Simmons, S.M.; Caird, J.K.; Ta, A.; Sterzer, F.; Hagel, B.E. Plight of the distracted pedestrian: A research synthesis and meta-analysis of mobile phone use on crossing behaviour. Inj. Prev. 2020, 26, 170–176. [Google Scholar] [CrossRef] [PubMed]
- Dünser, A.; Billinghurst, M.; Wen, J.; Lehtinen, V.; Nurminen, A. Exploring the use of handheld AR for outdoor navigation. Comput. Graph. 2012, 36, 1084–1095. [Google Scholar] [CrossRef]
- Rauschnabel, P.A.; Ro, Y.K. Augmented reality smart glasses: An investigation of technology acceptance drivers. Int. J. Technol. Mark. 2016, 11, 123–148. [Google Scholar] [CrossRef] [Green Version]
- Rauschnabel, P.A.; Brem, A.; Ivens, B.S. Who will buy smart glasses? Empirical results of two pre-market-entry studies on the role of personality in individual awareness and intended adoption of Google Glass wearables. Comput. Hum. Behav. 2015, 49, 635–647. [Google Scholar] [CrossRef]
- Javornik, A. Directions for studying user experience with augmented reality in public. In Augmented Reality and Virtual Reality; Springer: Berlin/Heidelberg, Germany, 2018; pp. 199–210. [Google Scholar]
- OECD; International Transport Forum. Pedestrian Safety, Urban Space and Health; Organisation for Economic Co-Operation and Development: Paris, France, 2012. [Google Scholar]
- Aromaa, S.; Väätänen, A.; Aaltonen, I.; Goriachev, V.; Helin, K.; Karjalainen, J. Awareness of the real-world environment when using augmented reality head-mounted display. Appl. Ergon. 2020, 88, 103145. [Google Scholar] [CrossRef]
- Hsieh, Y.T.; Jylhä, A.; Orso, V.; Gamberini, L.; Jacucci, G. Designing a willing-to-use-in-public hand gestural interaction technique for smart glasses. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 4203–4215. [Google Scholar]
- Nebeling, M.; Madier, K. 360proto: Making interactive virtual reality & augmented reality prototypes from paper. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–13. [Google Scholar]
- Pfeiffer-Leßmann, N.; Pfeiffer, T. ExProtoVAR: A lightweight tool for experience-focused prototyping of augmented reality applications using virtual reality. In Proceedings of the International Conference on Human-Computer Interaction, Las Vegas, NV, USA, 15–20 July 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 311–318. [Google Scholar]
- Berning, M.; Yonezawa, T.; Riedel, T.; Nakazawa, J.; Beigl, M.; Tokuda, H. pARnorama: 360 degree interactive video for augmented reality prototyping. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 8–12 September 2013; pp. 1471–1474. [Google Scholar]
- Freitas, G.; Pinho, M.S.; Silveira, M.S.; Maurer, F. A Systematic Review of Rapid Prototyping Tools for Augmented Reality. In Proceedings of the 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil, 7–10 November 2020; pp. 199–209. [Google Scholar]
- Grubert, J.; Langlotz, T.; Zollmann, S.; Regenbrecht, H. Towards pervasive augmented reality: Context-awareness in augmented reality. IEEE Trans. Vis. Comput. Graph. 2016, 23, 1706–1724. [Google Scholar] [CrossRef]
- Gruenefeld, U.; Auda, J.; Mathis, F.; Schneegass, S.; Khamis, M.; Gugenheimer, J.; Mayer, S. VRception: Rapid Prototyping of Cross-Reality Systems in Virtual Reality. In Proceedings of the CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–15. [Google Scholar]
- Alce, G.; Hermodsson, K.; Wallergård, M.; Thern, L.; Hadzovic, T. A prototyping method to simulate wearable augmented reality interaction in a virtual environment—A pilot study. Int. J. Virtual Worlds Hum. Comput. Interact. 2015, 3, 18–28. [Google Scholar] [CrossRef] [Green Version]
- Burova, A.; Mäkelä, J.; Hakulinen, J.; Keskinen, T.; Heinonen, H.; Siltanen, S.; Turunen, M. Utilizing VR and gaze tracking to develop AR solutions for industrial maintenance. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–13. [Google Scholar]
- Bailie, T.; Martin, J.; Aman, Z.; Brill, R.; Herman, A. Implementing user-centered methods and virtual reality to rapidly prototype augmented reality tools for firefighters. In Proceedings of the 10th International Conference on Augmented Cognition, Toronto, ON, Canada, 17–22 July 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 135–144. [Google Scholar]
- Gabbard, J.L.; Swan, J.E.; Hix, D. The effects of text drawing styles, background textures, and natural lighting on text legibility in outdoor augmented reality. Presence 2006, 15, 16–32. [Google Scholar] [CrossRef]
- Lu, F.; Xu, Y. Exploring Spatial UI Transition Mechanisms with Head-Worn Augmented Reality. In Proceedings of the CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–16. [Google Scholar]
- Hassenzahl, M.; Tractinsky, N. User experience—A research agenda. Behav. Inf. Technol. 2006, 25, 91–97. [Google Scholar] [CrossRef]
- Thi Minh Tran, T.; Parker, C. Designing exocentric pedestrian navigation for AR head mounted displays. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–8. [Google Scholar]
- Tran, T.T.M.; Parker, C.; Wang, Y.; Tomitsch, M. Designing Wearable Augmented Reality Concepts to Support Scalability in Autonomous Vehicle–Pedestrian Interaction. Front. Comput. Sci. 2022, 4, 866516. [Google Scholar] [CrossRef]
- Trepkowski, C.; Eibich, D.; Maiero, J.; Marquardt, A.; Kruijff, E.; Feiner, S. The effect of narrow field of view and information density on visual search performance in augmented reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 575–584. [Google Scholar]
- Lee, J.; Jin, F.; Kim, Y.; Lindlbauer, D. User Preference for Navigation Instructions in Mixed Reality. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Christchurch, New Zealand, 12–16 March 2022; pp. 802–811. [Google Scholar]
- Zhao, Y.; Kupferstein, E.; Rojnirun, H.; Findlater, L.; Azenkot, S. The effectiveness of visual and audio wayfinding guidance on smartglasses for people with low vision. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–14. [Google Scholar]
- Joy, P.C. This Is What the World Looks Like through Google Glass; 2013. [Google Scholar]
- Goldiez, B.F.; Ahmad, A.M.; Hancock, P.A. Effects of augmented reality display settings on human wayfinding performance. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2007, 37, 839–845. [Google Scholar] [CrossRef]
- Lehikoinen, J.; Suomela, R. WalkMap: Developing an augmented reality map application for wearable computers. Virtual Real. 2002, 6, 33–44. [Google Scholar] [CrossRef]
- Oculus. Introducing Oculus Quest, Our First 6DOF All-In-One VR System; Oculus VR: Irvine, CA, USA, 2019. [Google Scholar]
- Souman, J.L.; Giordano, P.R.; Schwaiger, M.; Frissen, I.; Thümmel, T.; Ulbrich, H.; Luca, A.D.; Bülthoff, H.H.; Ernst, M.O. CyberWalk: Enabling unconstrained omnidirectional walking through virtual environments. ACM Trans. Appl. Percept. (TAP) 2011, 8, 1–22. [Google Scholar] [CrossRef]
- Jayaraman, S.K.; Creech, C.; Robert Jr, L.P.; Tilbury, D.M.; Yang, X.J.; Pradhan, A.K.; Tsui, K.M. Trust in AV: An uncertainty reduction model of AV-pedestrian interactions. In Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 133–134. [Google Scholar]
- Boletsis, C.; Cedergren, J.E. VR locomotion in the new era of virtual reality: An empirical comparison of prevalent techniques. Adv. Hum. Comput. Interact. 2019, 2019, 7420781. [Google Scholar] [CrossRef]
- Di Luca, M.; Seifi, H.; Egan, S.; Gonzalez-Franco, M. Locomotion vault: The extra mile in analyzing vr locomotion techniques. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–10. [Google Scholar]
- Miguel-Alonso, I.; Rodriguez-Garcia, B.; Checa, D.; De Paolis, L.T. Developing a Tutorial for Improving Usability and User Skills in an Immersive Virtual Reality Experience. In Proceedings of the International Conference on Extended Reality, Lecce, Italy, 6–8 July 2022; Springer: Berlin/Heidelberg, Germany, 2022; pp. 63–78. [Google Scholar]
- Rouchitsas, A.; Alm, H. External human–machine interfaces for autonomous vehicle-to-pedestrian communication: A review of empirical work. Front. Psychol. 2019, 10, 2757. [Google Scholar] [CrossRef]
- Dey, D.; Habibovic, A.; Löcken, A.; Wintersberger, P.; Pfleging, B.; Riener, A.; Martens, M.; Terken, J. Taming the eHMI jungle: A classification taxonomy to guide, compare, and assess the design principles of automated vehicles’ external human-machine interfaces. Transp. Res. Interdiscip. Perspect. 2020, 7, 100174. [Google Scholar] [CrossRef]
- Colley, M.; Walch, M.; Rukzio, E. Unveiling the Lack of Scalability in Research on External Communication of Autonomous Vehicles. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery, CHI EA ’20, Honolulu, HI, USA, 25–30 April 2020; pp. 1–9. [Google Scholar] [CrossRef]
- Tabone, W.; de Winter, J.; Ackermann, C.; Bärgman, J.; Baumann, M.; Deb, S.; Emmenegger, C.; Habibovic, A.; Hagenzieker, M.; Hancock, P.; et al. Vulnerable road users and the coming wave of automated vehicles: Expert perspectives. Transp. Res. Interdiscip. Perspect. 2021, 9, 100293. [Google Scholar] [CrossRef]
- Hesenius, M.; Börsting, I.; Meyer, O.; Gruhn, V. Don’t panic! guiding pedestrians in autonomous traffic with augmented reality. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, Barcelona, Spain, 3–6 September 2018; pp. 261–268. [Google Scholar]
- Tabone, W.; Happee, R.; García, J.; Lee, Y.M.; Lupetti, M.L.; Merat, N.; de Winter, J. Augmented Reality Interfaces for Pedestrian-Vehicle Interactions: An Online Study; 2022. [Google Scholar]
- Tonguz, O.; Zhang, R.; Song, L.; Jaiprakash, A. System and Method Implementing Virtual Pedestrian Traffic Lights. U.S. Patent Application No. 17/190,983, 25 May 2021. [Google Scholar]
- Hoggenmüller, M.; Tomitsch, M.; Hespanhol, L.; Tran, T.T.M.; Worrall, S.; Nebot, E. Context-Based Interface Prototyping: Understanding the Effect of Prototype Representation on User Feedback. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–14. [Google Scholar]
- Schneider, S.; Bengler, K. Virtually the same? Analysing pedestrian behaviour by means of virtual reality. Transp. Res. Part F Traffic Psychol. Behav. 2020, 68, 231–256. [Google Scholar] [CrossRef]
- Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef] [Green Version]
- Golledge, R.G. Wayfinding Behavior: Cognitive Mapping and Other Spatial Processes; JHU Press: Baltimore, MD, USA, 1999. [Google Scholar]
- Schmidt, A.; Herrmann, T. Intervention user interfaces: A new interaction paradigm for automated systems. Interactions 2017, 24, 40–45. [Google Scholar] [CrossRef]
- Rauschnabel, P.A.; Hein, D.W.; He, J.; Ro, Y.K.; Rawashdeh, S.; Krulikowski, B. Fashion or technology? A fashnology perspective on the perception and adoption of augmented reality smart glasses. i-com 2016, 15, 179–194. [Google Scholar] [CrossRef]
- Simeone, A.L.; Cools, R.; Depuydt, S.; Gomes, J.M.; Goris, P.; Grocott, J.; Esteves, A.; Gerling, K. Immersive Speculative Enactments: Bringing Future Scenarios and Technology to Life Using Virtual Reality. In Proceedings of the CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–20. [Google Scholar]
- Krauß, V.; Nebeling, M.; Jasche, F.; Boden, A. Elements of XR Prototyping: Characterizing the Role and Use of Prototypes in Augmented and Virtual Reality Design. In Proceedings of the CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–18. [Google Scholar]
- Merenda, C.; Suga, C.; Gabbard, J.L.; Misu, T. Effects of “Real-World” Visual Fidelity on AR Interface Assessment: A Case Study Using AR Head-Up Display Graphics in Driving. In Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Beijing, China, 14–18 October 2019; pp. 145–156. [Google Scholar]
- Lu, F.; Bowman, D.A. Evaluating the potential of glanceable ar interfaces for authentic everyday uses. In Proceedings of the 2021 IEEE Virtual Reality and 3D User Interfaces (VR), Lisboa, Portugal, 27 March–1 April 2021; pp. 768–777. [Google Scholar]
- Morrison, A.; Oulasvirta, A.; Peltonen, P.; Lemmela, S.; Jacucci, G.; Reitmayr, G.; Näsänen, J.; Juustila, A. Like bees around the hive: A comparative study of a mobile augmented reality map. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009; pp. 1889–1898. [Google Scholar]
- Kjeldskov, J.; Skov, M.B. Was it worth the hassle? Ten years of mobile HCI research discussions on lab and field evaluations. In Proceedings of the 16th International Conference on Human-Computer Interaction with MOBILE devices & Services, Toronto, ON, Canada, 23–26 September 2014; pp. 43–52. [Google Scholar]
- Rogers, Y.; Connelly, K.; Tedesco, L.; Hazlewood, W.; Kurtz, A.; Hall, R.E.; Hursey, J.; Toscos, T. Why it’s worth the hassle: The value of in-situ studies when designing ubicomp. In Proceedings of the International Conference on Ubiquitous Computing, Innsbruck, Austria, 16–19 September 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 336–353. [Google Scholar]
- Lee, L.H.; Hui, P. Interaction methods for smart glasses: A survey. IEEE Access 2018, 6, 28712–28732. [Google Scholar] [CrossRef]
- Dole, L.; Ju, W. Face and ecological validity in simulations: Lessons from search-and-rescue HRI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–8. [Google Scholar]
- Witmer, B.G.; Singer, M.J. Measuring presence in virtual environments: A presence questionnaire. Presence 1998, 7, 225–240. [Google Scholar] [CrossRef]
- Flohr, L.A.; Janetzko, D.; Wallach, D.P.; Scholz, S.C.; Krüger, A. Context-based interface prototyping and evaluation for (shared) autonomous vehicles using a lightweight immersive video-based simulator. In Proceedings of the 2020 ACM Designing Interactive Systems Conference, Eindhoven, The Netherlands, 6–10 July 2020; pp. 1379–1390. [Google Scholar]
- Ragan, E.; Wilkes, C.; Bowman, D.A.; Hollerer, T. Simulation of augmented reality systems in purely virtual environments. In Proceedings of the 2009 IEEE Virtual Reality Conference, Lafayette, LA, USA, 14–18 March 2009; pp. 287–288. [Google Scholar]
| | Pedestrian Navigation | AV Interaction |
|---|---|---|
| Number of Conditions | 3 | 4 |
| VR Exposure per Condition | 3–5 min | 1–1.5 min |
| Movement | Joystick-based | Real walking |
| Interaction | Controller | Hand gesture |
| AR Content | Maps, turn arrow | Text prompt, crossing cues |
| Number of Participants (m/f) | 18 (9/9) | 24 (9/15) |
| Previous VR Experience | | |
| Never | 2 | 8 |
| Less than 5 times | 15 | 13 |
| More than 5 times | 1 | 3 |
| Study Location | Australia | Vietnam |