Interactions in Augmented and Mixed Reality: An Overview
Abstract
1. Introduction
2. Conceptual Framework Definition
2.1. Interaction Tasks
- Selection: Refers to the task of selecting an object in order to perform actions on it, such as retrieving its information or storing it as an argument for another action [54].
- Manipulation: Gives the user the ability to change any of the object’s attributes, e.g., its scale or position [55].
- Navigation: Gives the user the ability to move through an immersive environment by changing position or orientation [56].
- System control: Refers to the user’s ability to change the system state, for example through menus [57].
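The four tasks above can be sketched as a small dispatch model, purely for illustration; the names `InteractionTask`, `SceneObject`, and `perform` are ours, not from the surveyed literature:

```python
from dataclasses import dataclass
from enum import Enum, auto

class InteractionTask(Enum):
    """The four canonical interaction tasks listed above."""
    SELECTION = auto()       # choose an object to act on [54]
    MANIPULATION = auto()    # change object attributes, e.g., scale, position [55]
    NAVIGATION = auto()      # change the user's position or orientation [56]
    SYSTEM_CONTROL = auto()  # change system state, e.g., via menus [57]

@dataclass
class SceneObject:
    name: str
    scale: float = 1.0
    selected: bool = False

def perform(task: InteractionTask, obj: SceneObject, **params) -> SceneObject:
    """Dispatch an interaction task onto a scene object (illustrative only)."""
    if task is InteractionTask.SELECTION:
        obj.selected = True                        # stored for later actions
    elif task is InteractionTask.MANIPULATION:
        obj.scale = params.get("scale", obj.scale)
    return obj

obj = perform(InteractionTask.SELECTION, SceneObject("statue"))
obj = perform(InteractionTask.MANIPULATION, obj, scale=2.0)
```

In a real system each branch would delegate to an input technique from one of the modalities in Section 2.2; here the point is only that selection typically precedes manipulation, which is why a selected object is kept as state.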
2.2. Modality
- Visual-based: The visual-based modality includes any state change that a camera sensor can capture and that conveys meaning, whether it expresses the user’s intention to interact or presents visual feedback.
- Audio-based: The audio-based modality contains all actions and feedback that involve sound stimuli and sound perception.
- Haptic-based: The haptic-based modality defines all interactions that are perceived through the sense of touch or performed through graspable, tangible objects.
- Sensor-based: Finally, the sensor-based modality includes all interactions that require a sensor to capture information about an action or to transmit feedback to the user, beyond the visual, auditory, haptic, taste, and smell channels. Pressure detection is one example of this modality.
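The modality/method structure that Section 3 unfolds can be mirrored in a small lookup table; this is a hypothetical sketch of the taxonomy (the dictionary and function names are ours), useful for seeing that some methods, such as location- and marker-based ones, recur under more than one modality:

```python
# Lookup mirroring the subsections of Sections 3.1-3.4 of this overview.
MODALITY_METHODS = {
    "visual": {"gesture", "surface", "marker", "location"},
    "audio":  {"sound", "speech", "music", "location"},
    "haptic": {"touch", "marker", "controls", "feedback"},
    "sensor": {"pressure", "motion", "location", "iot"},
}

def modalities_for(method: str) -> list:
    """Return every modality under which an interaction method is classified."""
    return sorted(m for m, methods in MODALITY_METHODS.items() if method in methods)

# Location-based interaction appears under three modalities:
loc = modalities_for("location")   # ['audio', 'sensor', 'visual']
mrk = modalities_for("marker")     # ['haptic', 'visual']
```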
2.3. Context
2.4. Method
3. Research Results
3.1. Visual-Based Modality
- Optical see-through (OST) systems: display digital objects on a semi-transparent screen through which real objects are perceived directly.
- Video see-through (VST) systems: display digital objects on a screen together with real objects captured by a camera sensor, the approach commonly used for smartphone AR.
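The practical difference between the two display types is where real and virtual content are combined. In a VST pipeline the merge is a digital compositing step over each camera frame; in an OST display the real scene needs no such step, since it is viewed optically. A minimal per-pixel alpha-blend sketch (function name and signature are ours, purely illustrative):

```python
def composite_vst(camera_px, virtual_px, alpha):
    """Video see-through compositing: alpha-blend one rendered (virtual) RGB
    pixel over the corresponding camera pixel. An OST system skips this step,
    because the real scene is seen directly through the semi-transparent glass."""
    return tuple(
        round(alpha * v + (1.0 - alpha) * c)
        for v, c in zip(virtual_px, camera_px)
    )

# Half-transparent virtual pixel mixed over a camera pixel:
blended = composite_vst((10, 20, 30), (200, 100, 0), alpha=0.5)   # (105, 60, 15)
# A fully opaque virtual pixel replaces the camera pixel entirely:
opaque = composite_vst((10, 20, 30), (200, 100, 0), alpha=1.0)    # (200, 100, 0)
```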
3.1.1. Gesture-Based
3.1.2. Surface-Based
3.1.3. Marker-Based
3.1.4. Location-Based
3.2. Audio-Based Modality
3.2.1. Sound-Based
3.2.2. Speech-Based
3.2.3. Music-Based
3.2.4. Location-Based
3.3. Haptic-Based Modality
3.3.1. Touch-Based
3.3.2. Marker-Based
3.3.3. Controls-Based
3.3.4. Feedback-Based
3.4. Sensor-Based Modality
3.4.1. Pressure-Based
3.4.2. Motion-Based
3.4.3. Location-Based
3.4.4. IoT-Based
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Olsson, T. Concepts and Subjective Measures for Evaluating User Experience of Mobile Augmented Reality Services. In Human Factors in Augmented Reality Environments; Springer: New York, NY, USA, 2013; pp. 203–232. [Google Scholar]
- Han, D.I.; Tom Dieck, M.C.; Jung, T. User experience model for augmented reality applications in urban heritage tourism. J. Herit. Tour. 2018, 13, 46–61. [Google Scholar] [CrossRef]
- Kling, R. The organizational context of user-centered software designs. MIS Q. 1977, 1, 41–52. [Google Scholar] [CrossRef] [Green Version]
- Norman, D.A.; Draper, S.W. User Centered System Design: New Perspectives on Human-Computer Interaction, 1st ed.; CRC Press: Boca Raton, FL, USA, 1986. [Google Scholar]
- Etherington. Google Launches ‘Live View’ AR Walking Directions for Google Maps. Available online: tinyurl.com/y48mt75e (accessed on 28 July 2021).
- Cordeiro, D.; Correia, N.; Jesus, R. ARZombie: A Mobile Augmented Reality Game with Multimodal Interaction. In Proceedings of the 2015 7th International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN), Torino, Italy, 10–12 June 2015. [Google Scholar]
- Zollhöfer, M.; Thies, J.; Garrido, P.; Bradley, D.; Beeler, T.; Pérez, P.; Stamminger, M.; Nießner, M.; Theobalt, C. State of the art on monocular 3D face reconstruction, tracking, and applications. In Computer Graphics Forum; John Wiley & Sons Ltd.: Hoboken, NJ, USA, 2018; Volume 37, pp. 523–550. [Google Scholar]
- Ladwig, P.; Geiger, C. A Literature Review on Collaboration in Mixed Reality. In International Conference on Remote Engineering and Virtual Instrumentation; Springer: New York, NY, USA, 2018; pp. 591–600. [Google Scholar]
- Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
- Coutrix, C.; Nigay, L. Mixed Reality: A Model of Mixed Interaction. In Proceedings of the Working Conference on Advanced Visual Interfaces, Venezia, Italy, 23–26 May 2006. [Google Scholar]
- Evangelidis, K.; Papadopoulos, T.; Sylaiou, S. Mixed Reality: A Reconsideration Based on Mixed Objects and Geospatial Modalities. Appl. Sci. 2021, 11, 2417. [Google Scholar] [CrossRef]
- Chen, W. Historical Oslo on a handheld device–a mobile augmented reality application. Procedia Comput. Sci. 2014, 35, 979–985. [Google Scholar] [CrossRef] [Green Version]
- Oleksy, T.; Wnuk, A. Augmented places: An impact of embodied historical experience on attitudes towards places. Comput. Hum. Behav. 2016, 57, 11–16. [Google Scholar] [CrossRef]
- Phithak, T.; Kamollimsakul, S. Korat Historical Explorer: The Augmented Reality Mobile Application to Promote Historical Tourism in Korat. In Proceedings of the 2020 the 3rd International Conference on Computers in Management and Business, Tokyo, Japan, 31 January–2 February 2020; pp. 283–289. [Google Scholar]
- Nguyen, V.T.; Jung, K.; Yoo, S.; Kim, S.; Park, S.; Currie, M. Civil War Battlefield Experience: Historical Event Simulation Using Augmented Reality Technology. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), San Diego, CA, USA, 9–11 December 2019. [Google Scholar]
- Cavallo, M.; Rhodes, G.A.; Forbes, A.G. Riverwalk: Incorporating Historical Photographs in Public Outdoor Augmented Reality Experiences. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, Yucatan, Mexico, 19–23 September 2016. [Google Scholar]
- Angelini, C.; Williams, A.S.; Kress, M.; Vieira, E.R.; D’Souza, N.; Rishe, N.D.; Ortega, F.R. City planning with augmented reality. arXiv 2020, arXiv:2001.06578. [Google Scholar]
- Sihi, D. Home sweet virtual home: The use of virtual and augmented reality technologies in high involvement purchase decisions. J. Res. Interact. Mark. 2018, 12, 398–417. [Google Scholar] [CrossRef]
- Fu, M.; Liu, R. The Application of Virtual Reality and Augmented Reality in Dealing with Project Schedule Risks. In Proceedings of the Construction Research Congress, New Orleans, LA, USA, 2–4 April 2018; pp. 429–438. [Google Scholar]
- Amaguaña, F.; Collaguazo, B.; Tituaña, J.; Aguilar, W.G. Simulation System Based on Augmented Reality for Optimization of Training Tactics on Military Operations. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics; Springer: New York, NY, USA, 2018; pp. 394–403. [Google Scholar]
- Livingston, M.A.; Rosenblum, L.J.; Brown, D.G.; Schmidt, G.S.; Julier, S.J.; Baillot, Y.; Maassel, P. Military Applications of Augmented Reality. In Handbook of Augmented Reality, 2011th ed.; Springer: New York, NY, USA, 2011; pp. 671–706. [Google Scholar]
- Hagan, A. Illusion & Delusion: Living in reality when inventing imaginary worlds. J. Animat. Annu. Creat. Aust. 2015, 75, 75–80. [Google Scholar]
- Ramos, F.; Granell, C.; Ripolles, O. An Architecture for the Intelligent Creation of Imaginary Worlds for Running. In Intelligent Computer Graphics 2012; Springer: Berlin/Heidelberg, Germany, 2013; pp. 209–225. [Google Scholar]
- Akins, H.B.; Smith, D.A. Imaging planets from imaginary worlds. Phys. Teach. 2018, 56, 486–487. [Google Scholar] [CrossRef]
- Bunt, H. Issues in multimodal human-computer communication. In International Conference on Cooperative Multimodal Communication; Springer: Berlin/Heidelberg, Germany, 1995; pp. 1–12. [Google Scholar]
- Quek, F.; McNeill, D.; Bryll, R.; Duncan, S.; Ma, X.F.; Kirbas, C.; Ansari, R. Multimodal human discourse: Gesture and speech. ACM Trans. Comput.-Hum. Interact. (TOCHI) 2002, 9, 171–193. [Google Scholar] [CrossRef]
- Ling, J.; Peng, Z.; Yin, L.; Yuan, X. How Efficiency and Naturalness Change in Multimodal Interaction in Mobile Navigation Apps. In International Conference on Applied Human Factors and Ergonomics; Springer: New York, NY, USA, 2020; pp. 196–207. [Google Scholar]
- Camba, J.; Contero, M.; Salvador-Herranz, G. Desktop vs. Mobile: A Comparative Study of Augmented Reality Systems for Engineering Visualizations in Education. In 2014 IEEE Frontiers in Education Conference (FIE) Proceedings; IEEE: Manhattan, NY, USA, 2014; pp. 1–8. [Google Scholar]
- Bekele, M.K.; Pierdicca, R.; Frontoni, E.; Malinverni, E.S.; Gain, J. A survey of augmented, virtual, and mixed reality for cultural heritage. J. Comput. Cult. Herit. (JOCCH) 2018, 11, 1–36. [Google Scholar] [CrossRef]
- Karray, F.; Alemzadeh, M.; Abou Saleh, J.; Arab, M.N. Human-computer interaction: Overview on state of the art. Int. J. Smart Sens. Intell. Syst. 2017, 1, 137–159. [Google Scholar] [CrossRef] [Green Version]
- Nizam, S.S.M.; Abidin, R.Z.; Hashim, N.C.; Lam, M.C.; Arshad, H.; Majid, N.A.A. A review of multimodal interaction technique in augmented reality environment. Int. J. Adv. Sci. Eng. Inf. Technol. 2018, 8, 1460–1469. [Google Scholar] [CrossRef] [Green Version]
- Saroha, K.; Sharma, S.; Bhatia, G. Human computer interaction: An intellectual approach. IJCSMS Int. J. Comput. Sci. Manag. Stud. 2011, 11, 147–154. [Google Scholar]
- Tektonidis, D.; Koumpis, A. Accessible Internet-of-Things and Internet-of-Content Services for All in the Home or on the Move. Int. J. Interact. Mob. Technol. 2012, 6, 25–33. [Google Scholar] [CrossRef] [Green Version]
- Tektonidis, D.; Karagiannidis, C.; Kouroupetroglou, C.; Koumpis, A. Intuitive User Interfaces to Help Boost Adoption of Internet-of-Things and Internet-of-Content Services for All. In Inter-Cooperative Collective Intelligence: Techniques and Applications; Springer: Berlin/Heidelberg, Germany, 2014; pp. 93–110. [Google Scholar]
- Badhiti, K.R. HCI-Ubiquitous Computing and Ambient Technologies in the Universe. Int. J. Adv. Res. Comput. Sci. Manag. Stud. 2015, 3, 1. [Google Scholar]
- Raymond, O.U.; Ogbonna, A.C.; Shade, K. Human Computer Interaction: Overview and Challenges. 2014. Available online: https://www.researchgate.net/publication/263254929_Human_Computer_InteractionOverview_and_Challenges (accessed on 10 September 2021).
- Ahluwalia, S.; Pal, B.; Wason, R. Gestural Interface Interaction: A Methodical Review. Int. J. Comput. Appl. 2012, 60, 21–28. [Google Scholar]
- Nautiyal, L.; Malik, P.; Ram, M. Computer Interfaces in Diagnostic Process of Industrial Engineering. In Diagnostic Techniques in Industrial Engineering; Springer: New York, NY, USA, 2018; pp. 157–170. [Google Scholar]
- Alao, O.D.; Joshua, J.V. Human Ability Improvement with Wireless Sensors in Human Computer Interaction. Int. J. Comput. Appl. Technol. Res. 2019, 8, 331–339. [Google Scholar]
- Norman, D. The Design of Everyday Things: Revised and Expanded Edition; Basic Books: New York, NY, USA, 2013. [Google Scholar]
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
- McGann, M. Perceptual modalities: Modes of presentation or modes of interaction? J. Conscious. Stud. 2010, 17, 72–94. [Google Scholar]
- Stokes, D.; Matthen, M.; Biggs, S. (Eds.) Perception and Its Modalities; Oxford University Press: Oxford, UK, 2015. [Google Scholar]
- Pamparău, C.; Vatavu, R.D. A Research Agenda Is Needed for Designing for the User Experience of Augmented and Mixed Reality: A Position Paper. In Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia, Essen, Germany, 22–25 November 2020; pp. 323–325. [Google Scholar]
- Ghazwani, Y.; Smith, S. Interaction in augmented reality: Challenges to enhance user experience. In Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations, Sydney, Australia, 14–16 February 2020; pp. 39–44. [Google Scholar]
- Irshad, S.; Rambli, D.R.B.A. User Experience of Mobile Augmented Reality: A Review of Studies. In Proceedings of the 2014 3rd International Conference on User Science and Engineering (i-USEr), Shah Alam, Malaysia, 2–5 September 2014; pp. 125–130. [Google Scholar]
- Côté, S.; Trudel, P.; Desbiens, M.; Giguère, M.; Snyder, R. Live Mobile Panoramic High Accuracy Augmented Reality for Engineering and Construction. In Proceedings of the Construction Applications of Virtual Reality (CONVR), London, UK, 30–31 October 2013; pp. 1–10. [Google Scholar]
- Azimi, E.; Qian, L.; Navab, N.; Kazanzides, P. Alignment of the virtual scene to the tracking space of a mixed reality head-mounted display. arXiv 2017, arXiv:1703.05834. [Google Scholar]
- Peng, J. Changing Spatial Boundaries. Available online: http://www.interactivearchitecture.org/changing-spatial-boundaries.html (accessed on 10 September 2021).
- Bill, R.; Cap, C.; Kofahl, M.; Mundt, T. Indoor and outdoor positioning in mobile environments: A review and some investigations on WLAN positioning. Geogr. Inf. Sci. 2004, 10, 91–98. [Google Scholar]
- Vijayaraghavan, G.; Kaner, C. Bug taxonomies: Use them to generate better tests. Star East 2003, 2003, 1–40. [Google Scholar]
- Lazar, J.; Feng, J.H.; Hochheiser, H. Research Methods in Human-Computer Interaction; Morgan Kaufmann: Burlington, MA, USA, 2017. [Google Scholar]
- Bachmann, D.; Weichert, F.; Rinkenauer, G. Review of three-dimensional human-computer interaction with focus on the leap motion controller. Sensors 2018, 18, 2194. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Franco, J.; Cabral, D. Augmented object selection through smart glasses. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, Pisa, Italy, 26–29 November 2019; pp. 1–5. [Google Scholar]
- Mossel, A.; Venditti, B.; Kaufmann, H. 3DTouch and HOMER-S: Intuitive manipulation techniques for one-handed handheld augmented reality. In Proceedings of the Virtual Reality International Conference: Laval Virtual, Laval, France, 4–6 April 2013; pp. 1–10. [Google Scholar]
- Narzt, W.; Pomberger, G.; Ferscha, A.; Kolb, D.; Müller, R.; Wieghardt, J.; Lindinger, C. Augmented reality navigation systems. Univers. Access Inf. Soc. 2006, 4, 177–187. [Google Scholar] [CrossRef]
- Reeves, L.M.; Lai, J.; Larson, J.A.; Oviatt, S.; Balaji, T.S.; Buisine, S.; Wang, Q.Y. Guidelines for multimodal user interface design. Commun. ACM 2004, 47, 57–59. [Google Scholar] [CrossRef]
- Rokhsaritalemi, S.; Sadeghi-Niaraki, A.; Choi, S.M. A review on mixed reality: Current trends, challenges and prospects. Appl. Sci. 2020, 10, 636. [Google Scholar] [CrossRef] [Green Version]
- Meža, S.; Turk, Ž.; Dolenc, M. Measuring the potential of augmented reality in civil engineering. Adv. Eng. Softw. 2015, 90, 1–10. [Google Scholar] [CrossRef]
- Ellenberger, K. Virtual and augmented reality in public archaeology teaching. Adv. Archaeol. Pract. 2017, 5, 305–309. [Google Scholar] [CrossRef] [Green Version]
- Barsom, E.Z.; Graafland, M.; Schijven, M.P. Systematic review on the effectiveness of augmented reality applications in medical training. Surg. Endosc. 2016, 30, 4174–4183. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wu, H.K.; Lee, S.W.Y.; Chang, H.Y.; Liang, J.C. Current status, opportunities and challenges of augmented reality in education. Comput. Educ. 2013, 62, 41–49. [Google Scholar] [CrossRef]
- Chen, L.; Francis, K.; Tang, W. Semantic Augmented Reality Environment with Material-Aware Physical Interactions. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; pp. 135–136. [Google Scholar]
- Chen, L.; Tang, W.; John, N.; Wan, T.R.; Zhang, J.J. Context-aware mixed reality: A framework for ubiquitous interaction. arXiv 2018, arXiv:1803.05541. [Google Scholar]
- Serafin, S.; Geronazzo, M.; Erkut, C.; Nilsson, N.C.; Nordahl, R. Sonic interactions in virtual reality: State of the art, current challenges, and future directions. IEEE Comput. Graph. Appl. 2018, 38, 31–43. [Google Scholar] [CrossRef] [PubMed]
- Sporr, A.; Blank-Landeshammer, B.; Kasess, C.H.; Drexler-Schmid, G.H.; Kling, S.; Köfinger, C.; Reichl, C. Extracting boundary conditions for sound propagation calculations using augmented reality. Elektrotechnik Inf. 2021, 138, 197–205. [Google Scholar] [CrossRef]
- Han, C.; Luo, Y.; Mesgarani, N. Real-Time Binaural Speech Separation with Preserved Spatial Cues. In Proceedings of the ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Barcelona, Spain, 4–8 May 2020; pp. 6404–6408. [Google Scholar]
- Rolland, J.P.; Holloway, R.L.; Fuchs, H. Comparison of optical and video see-through, head-mounted displays. In Telemanipulator and Telepresence Technologies. Int. Soc. Opt. Photonics 1995, 2351, 293–307. [Google Scholar]
- Nieters, J. Defining an Interaction Model: The Cornerstone of Application Design. 2012. Available online: https://www.uxmatters.com/mt/archives/2012/01/defining-an-interaction-model-the-cornerstone-of-application-design.php (accessed on 10 September 2021).
- Sostel. Eye-Gaze and Dwell. Available online: https://docs.microsoft.com/en-us/windows/mixed-reality/design/gaze-and-dwell-eyes (accessed on 28 July 2021).
- Ballantyne, M.; Jha, A.; Jacobsen, A.; Hawker, J.S.; El-Glaly, Y.N. Study of Accessibility Guidelines of Mobile Applications. In Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, Cairo, Egypt, 25–28 November 2018; pp. 305–315. [Google Scholar]
- Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. Exploring Natural Eye-Gaze-Based Interaction for Immersive Virtual Reality. In Proceedings of the 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, 18–19 March 2017; pp. 36–39. [Google Scholar]
- Jacob, R.J. What You Look at Is What You Get: Eye Movement-Based Interaction Techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Washington, DC, USA, 1–5 April 1990; pp. 11–18. [Google Scholar]
- Pomplun, M.; Sunkara, S. Pupil Dilation as an Indicator of Cognitive Workload in Human-Computer Interaction. In Proceedings of the International Conference on HCI, Crete, Greece, 22–27 June 2003; Volume 273. [Google Scholar]
- Samara, A.; Galway, L.; Bond, R.; Wang, H. Human-Computer Interaction Task Classification via Visual-Based Input Modalities. In International Conference on Ubiquitous Computing and Ambient Intelligence; Springer: Manhattan, NY, USA, 2017; pp. 636–642. [Google Scholar]
- Bazarevsky, V.; Kartynnik, Y.; Vakunov, A.; Raveendran, K.; Grundmann, M. Blazeface: Sub-millisecond neural face detection on mobile gpus. arXiv 2019, arXiv:1907.05047. [Google Scholar]
- Kantonen, T.; Woodward, C.; Katz, N. Mixed Reality in Virtual World Teleconferencing. In Proceedings of the 2010 IEEE Virtual Reality Conference (VR), Waltham, MA, USA, 20–24 March 2010; pp. 179–182. [Google Scholar]
- Shreyas, K.K.; Rajendran, R.; Wan, Q.; Panetta, K.; Agaian, S.S. TERNet: A Deep Learning Approach for Thermal Face Emotion Recognition. In Proceedings of the SPIE 10993, Mobile Multimedia/Image Processing, Security, and Applications, Baltimore, MD, USA, 4–18 April 2019; Volume 1099309. [Google Scholar]
- Busso, C.; Deng, Z.; Yildirim, S.; Bulut, M.; Lee, C.M.; Kazemzadeh, A.; Narayanan, S. Analysis of Emotion Recognition Using Facial Expressions, Speech and Multimodal Information. In Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, USA, 14–15 October 2004; pp. 205–211. [Google Scholar]
- Acquisti, A.; Gross, R.; Stutzman, F.D. Face recognition and privacy in the age of augmented reality. J. Priv. Confid. 2014, 6, 1. [Google Scholar] [CrossRef]
- Mehta, D.; Siddiqui, M.F.H.; Javaid, A.Y. Facial emotion recognition: A survey and real-world user experiences in mixed reality. Sensors 2018, 18, 416. [Google Scholar] [CrossRef] [Green Version]
- Lei, G.; Li, X.H.; Zhou, J.L.; Gong, X.G. Geometric Feature Based Facial Expression Recognition Using Multiclass Support Vector Machines. In Proceedings of the 2009 IEEE International Conference on Granular Computing, Nanchang, China, 17–19 August 2009; pp. 318–321. [Google Scholar]
- Christou, N.; Kanojiya, N. Human Facial Expression Recognition with Convolution Neural Networks. In Third International Congress on Information and Communication Technology; Springer: Singapore, 2019; pp. 539–545. [Google Scholar]
- Slater, M.; Usoh, M. Body centred interaction in immersive virtual environments. Artif. Life Virtual Real. 1994, 1, 125–148. [Google Scholar]
- Lee, I.J. Kinect-for-windows with augmented reality in an interactive roleplay system for children with an autism spectrum disorder. Interact. Learn. Environ. 2021, 29, 688–704. [Google Scholar] [CrossRef]
- Hsiao, K.F.; Rashvand, H.F. Body Language and Augmented Reality Learning Environment. In Proceedings of the 2011 Fifth FTRA International Conference on multimedia and ubiquitous engineering, Crete, Greece, 28–30 June 2011; pp. 246–250. [Google Scholar]
- Umeda, T.; Correa, P.; Marques, F.; Marichal, X. A Real-Time Body Analysis for Mixed Reality Application. In Proceedings of the Korea-Japan Joint Workshop on Frontiers of Computer Vision, FCV-2004, Fukuoka, Japan, 4–6 February 2004. [Google Scholar]
- Yousefi, S.; Kidane, M.; Delgado, Y.; Chana, J.; Reski, N. 3D Gesture-Based Interaction for Immersive Experience in Mobile VR. In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, 4–8 December 2016; pp. 2121–2126. [Google Scholar]
- Weichel, C.; Lau, M.; Kim, D.; Villar, N.; Gellersen, H.W. MixFab: A Mixed-Reality Environment for Personal Fabrication. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 3855–3864. [Google Scholar]
- Yang, M.T.; Liao, W.C. Computer-assisted culture learning in an online augmented reality environment based on free-hand gesture interaction. IEEE Trans. Learn. Technol. 2014, 7, 107–117. [Google Scholar] [CrossRef]
- Porter, S.R.; Marner, M.R.; Smith, R.T.; Zucco, J.E.; Thomas, B.H. Validating Spatial Augmented Reality for Interactive Rapid Prototyping. In Proceedings of the 2010 IEEE International Symposium on Mixed and Augmented Reality, Seoul, Korea, 13–16 October 2010. [Google Scholar]
- Fadzli, F.E.; Ismail, A.W.; Aladin, M.Y.F.; Othman, N.Z.S. A Review of Mixed Reality Telepresence. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2021; Volume 864, p. 012081. [Google Scholar]
- Moares, R.; Jadhav, V.; Bagul, R.; Jacbo, R.; Rajguru, S. Inter AR: Interior Decor App Using Augmented Reality Technology. In Proceedings of the 5th International Conference on Cyber Security & Privacy in Communication Networks (ICCS), Kurukshetra, Haryana, India, 29–30 November 2019. [Google Scholar]
- Polvi, J.; Taketomi, T.; Yamamoto, G.; Dey, A.; Sandor, C.; Kato, H. SlidAR: A 3D positioning method for SLAM-based handheld augmented reality. Comput. Graph. 2016, 55, 33–43. [Google Scholar] [CrossRef] [Green Version]
- Sandu, M.; Scarlat, I.S. Augmented Reality Uses in Interior Design. Inform. Econ. 2018, 22, 5–13. [Google Scholar] [CrossRef]
- Labrie, A.; Cheng, J. Adapting Usability Heuristics to the Context of Mobile Augmented Reality. In Proceedings of the Adjunct Publication of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual Event, USA, 20–23 October 2020. [Google Scholar]
- Ong, S.K.; Wang, Z.B. Augmented assembly technologies based on 3D bare-hand interaction. CIRP Ann. 2011, 60, 1–4. [Google Scholar] [CrossRef]
- Mitasova, H.; Mitas, L.; Ratti, C.; Ishii, H.; Alonso, J.; Harmon, R.S. Real-time landscape model interaction using a tangible geospatial modeling environment. IEEE Comput. Graph. Appl. 2006, 26, 55–63. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Punpongsanon, P.; Iwai, D.; Sato, K. Softar: Visually manipulating haptic softness perception in spatial augmented reality. IEEE Trans. Vis. Comput. Graph. 2015, 21, 1279–1288. [Google Scholar] [CrossRef]
- Barreira, J.; Bessa, M.; Barbosa, L.; Magalhães, L. A context-aware method for authentically simulating outdoors shadows for mobile augmented reality. IEEE Trans. Vis. Comput. Graph. 2017, 24, 1223–1231. [Google Scholar] [CrossRef] [Green Version]
- Lensing, P.; Broll, W. Instant Indirect Illumination for Dynamic Mixed Reality Scenes. In Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Atlanta, GA, USA, 5–8 November 2012. [Google Scholar]
- Fiala, M. ARTag, a Fiducial Marker System Using Digital Techniques. In 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05); IEEE: Manhattan, NY, USA, June 2005; Volume 2, pp. 590–596. [Google Scholar]
- Onime, C.; Uhomoibhi, J.; Wang, H.; Santachiara, M. A reclassification of markers for mixed reality environments. Int. J. Inf. Learn. Technol. 2020, 38, 161–173. [Google Scholar] [CrossRef]
- Fiala, M. Designing highly reliable fiducial markers. IEEE Trans. Pattern Anal. Mach. Intell. 2009, 32, 1317–1324. [Google Scholar] [CrossRef]
- Naimark, L.; Foxlin, E. Circular Data Matrix Fiducial System And Robust Image Processing For A Wearable Vision-Inertial Self-tracker. In Proceedings of the International Symposium on Mixed and Augmented Reality; IEEE: Manhattan, NY, USA, 2002; pp. 27–36. [Google Scholar]
- Bencina, R.; Kaltenbrunner, M. The Design and Evolution of Fiducials for the Reactivision System. In Proceedings of the Third International Conference on Generative Systems in the Electronic Arts, 1st ed.; Monash University Publishing: Melbourne, VIC, Australia, 2005. [Google Scholar]
- Rekimoto, J.; Ayatsuka, Y. CyberCode: Designing Augmented Reality Environments with Visual Tags. In Proceedings of DARE 2000 on Designing Augmented Reality Environments; ACM: New York, NY, USA, 2000; pp. 1–10. [Google Scholar]
- Rohs, M. Visual Code Widgets for Marker-Based Interaction. In Proceedings of the 25th IEEE International Conference on Distributed Computing Systems Workshops; IEEE: Manhattan, NY, USA, 2005; pp. 506–513. [Google Scholar]
- Boulanger, P. Application of Augmented Reality to Industrial Tele-Training. In Proceedings of the First Canadian Conference on Computer and Robot Vision; IEEE: Manhattan, NY, USA, 2004; pp. 320–328. [Google Scholar]
- Pentenrieder, K.; Meier, P.; Klinker, G. Analysis of Tracking Accuracy for Single-Camera Square-Marker-Based Tracking. In Proceedings of the Dritter Workshop Virtuelle und Erweiterte Realitt der GIFachgruppe VR/AR; Citeseer: Koblenz, Germany, 2006. [Google Scholar]
- Flohr, D.; Fischer, J. A Lightweight ID-Based Extension for Marker Tracking Systems. 2007. Available online: https://www.researchgate.net/publication/228541592_A_Lightweight_ID-Based_Extension_for_Marker_Tracking_Systems (accessed on 10 September 2021).
- Mateos, L.A. AprilTags 3D: Dynamic Fiducial Markers for Robust Pose Estimation in Highly Reflective Environments and Indirect Communication in Swarm Robotics. arXiv 2020, arXiv:2001.08622. [Google Scholar]
- Wang, T.; Liu, Y.; Wang, Y. Infrared Marker Based Augmented Reality System for Equipment Maintenance. In Proceedings of the 2008 International Conference on Computer Science and Software Engineering; IEEE: Manhattan, NY, USA, December 2008; Volume 5, pp. 816–819. [Google Scholar]
- Nee, A.Y.; Ong, S.K.; Chryssolouris, G.; Mourtzis, D. Augmented reality applications in design and manufacturing. CIRP Ann. 2012, 61, 657–679. [Google Scholar] [CrossRef]
- Evangelidis, K.; Sylaiou, S.; Papadopoulos, T. Mergin’mode: Mixed reality and geoinformatics for monument demonstration. Appl. Sci. 2020, 10, 3826. [Google Scholar] [CrossRef]
- Stricker, D.; Karigiannis, J.; Christou, I.T.; Gleue, T.; Ioannidis, N. Augmented Reality for Visitors of Cultural Heritage Sites. In Proceedings of the Int. Conf. on Cultural and Scientific Aspects of Experimental Media Spaces, Bonn, Germany, 21–22 September 2001; pp. 89–93. [Google Scholar]
- Shen, R.; Terada, T.; Tsukamoto, M. A system for visualizing sound source using augmented reality. Int. J. Pervasive Comput. Commun. 2013, 9, 227–242. [Google Scholar] [CrossRef]
- Rajguru, C.; Obrist, M.; Memoli, G. Spatial soundscapes and virtual worlds: Challenges and opportunities. Front. Psychol. 2020, 11, 2714. [Google Scholar] [CrossRef] [PubMed]
- Härmä, A.; Jakka, J.; Tikander, M.; Karjalainen, M.; Lokki, T.; Hiipakka, J.; Lorho, G. Augmented reality audio for mobile and wearable appliances. J. Audio Eng. Soc. 2004, 52, 618–639. [Google Scholar]
- Nordahl, R.; Turchet, L.; Serafin, S. Sound synthesis and evaluation of interactive footsteps and environmental sounds rendering for virtual reality applications. IEEE Trans. Vis. Comput. Graph. 2011, 17, 1234–1244. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Verron, C.; Aramaki, M.; Kronland-Martinet, R.; Pallone, G. A 3-D immersive synthesizer for environmental sounds. IEEE Trans. Audio Speech Lang. Process. 2009, 18, 1550–1561. [Google Scholar] [CrossRef]
- Gaver, W.W. What in the world do we hear?: An ecological approach to auditory event perception. Ecol. Psychol. 1993, 5, 1–29. [Google Scholar] [CrossRef]
- Che Hashim, N.; Abd Majid, N.A.; Arshad, H.; Khalid Obeidy, W. User satisfaction for an augmented reality application to support productive vocabulary using speech recognition. Adv. Multimed. 2018, 2018, 9753979. [Google Scholar] [CrossRef]
- Billinghurst, M.; Kato, H.; Myojin, S. Advanced Interaction Techniques for Augmented Reality Applications. In International Conference on Virtual and Mixed Reality; Springer: Berlin/Heidelberg, Germany, 2009; pp. 13–22. [Google Scholar]
- Kato, H.; Billinghurst, M.; Poupyrev, I.; Imamoto, K.; Tachibana, K. Virtual Object Manipulation on a Table-Top AR Environment. In Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000); IEEE: Manhattan, NY, USA, 2000; pp. 111–119. [Google Scholar]
- Denecke. Ariadne Spoken Dialogue System. Available online: tinyurl.com/3wjzerds (accessed on 28 July 2021).
- Microsoft Speech SDK (SAPI 5.0). Available online: tinyurl.com/58mf8bz5 (accessed on 28 July 2021).
- Hanifa, R.M.; Isa, K.; Mohamad, S. A review on speaker recognition: Technology and challenges. Comput. Electr. Eng. 2021, 90, 107005. [Google Scholar] [CrossRef]
- Chollet, G.; Esposito, A.; Gentes, A.; Horain, P.; Karam, W.; Li, Z.; Zouari, L. Multimodal Human Machine Interactions in Virtual and Augmented Reality (v-dij-14); Springer: New York, NY, USA, 2008. [Google Scholar]
- Lin, T.; Huang, L.; Hannaford, B.; Tran, C.; Raiti, J.; Zaragoza, R.; James, J. Empathics system: Application of Emotion Analysis AI through Smart Glasses. In Proceedings of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments; ACM: New York, NY, USA, June 2020; pp. 1–4. [Google Scholar]
- Mirzaei, M.R.; Ghorshi, S.; Mortazavi, M. Combining Augmented Reality and Speech Technologies to Help Deaf and Hard of Hearing People. In Proceedings of the 2012 14th Symposium on Virtual and Augmented Reality; IEEE Computer Society: Washington, DC, USA, May 2012; pp. 174–181. [Google Scholar]
- Altosaar, R.; Tindale, A.; Doyle, J. Physically Colliding with Music: Full-body Interactions with an Audio-only Virtual Reality Interface. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction; ACM: New York, NY, USA, March 2019; pp. 553–557. [Google Scholar]
- Bauer, V.; Bouchara, T. First Steps Towards Augmented Reality Interactive Electronic Music Production. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW); IEEE Computer Society: Washington, DC, USA, March 2021; pp. 90–93. [Google Scholar]
- Bederson, B.B. Audio Augmented Reality: A Prototype Automated Tour Guide. In Proceedings of the Conference Companion on Human Factors in Computing Systems; ACM: New York, NY, USA, May 1995; pp. 210–211. [Google Scholar]
- Paterson, N.; Naliuka, K.; Jensen, S.K.; Carrigy, T.; Haahr, M.; Conway, F. Design, Implementation and Evaluation of Audio for a Location Aware Augmented Reality Game. In Proceedings of the 3rd International Conference on Fun and Games; ACM: New York, NY, USA, September 2010; pp. 149–156. [Google Scholar]
- Lyons, K.; Gandy, M.; Starner, T. Guided by voices: An audio augmented reality system. In Proceedings of the International Conference of Auditory Display (ICAD); Georgia Institute of Technology: Atlanta, GA, USA, 2000; pp. 57–62. [Google Scholar]
- Blum, J.R.; Bouchard, M.; Cooperstock, J.R. What’s around Me? Spatialized Audio Augmented Reality for Blind Users with a Smartphone. In International Conference on Mobile and Ubiquitous Systems: Computing, Networking, and Services; Springer: Berlin/Heidelberg, Germany, 2011; pp. 49–62. [Google Scholar]
- Yang, Y.; Shim, J.; Chae, S.; Han, T.D. Mobile Augmented Reality Authoring Tool. In Proceedings of the 2016 IEEE Tenth International Conference on Semantic Computing (ICSC); IEEE: Manhattan, NY, USA, February 2016; pp. 358–361. [Google Scholar]
- Jung, J.; Hong, J.; Park, S.; Yang, H.S. Smartphone as an Augmented Reality Authoring Tool via Multi-Touch Based 3D Interaction Method. In Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry; ACM: New York, NY, USA, December 2012; pp. 17–20. [Google Scholar]
- Kasahara, S.; Niiyama, R.; Heun, V.; Ishii, H. exTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, Barcelona, Spain, 10–13 February 2013; pp. 223–228. [Google Scholar]
- Yannier, N.; Koedinger, K.R.; Hudson, S.E. Learning from Mixed-Reality Games: Is Shaking a Tablet as Effective as Physical Observation? In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, April 2015; pp. 1045–1054. [Google Scholar]
- Xiao, R.; Schwarz, J.; Throm, N.; Wilson, A.D.; Benko, H. MRTouch: Adding Touch Input to Head-Mounted Mixed Reality. IEEE Trans. Vis. Comput. Graph. 2018, 24, 1653–1660. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Kienzle, W.; Ma, Y.; Ng, S.S.; Benko, H.; Harrison, C. ActiTouch: Robust Touch Detection for On-Skin AR/VR Interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology; ACM: New York, NY, USA, October 2019; pp. 1151–1159. [Google Scholar]
- Jiawei, W.; Li, Y.; Tao, L.; Yuan, Y. Three-Dimensional Interactive Pen Based on Augmented Reality. In Proceedings of the 2010 International Conference on Image Analysis and Signal Processing, Povoa de Varzim, Portugal, 21–23 June 2010. [Google Scholar]
- Yue, Y.T.; Zhang, X.; Yang, Y.; Ren, G.; Choi, Y.K.; Wang, W. WireDraw: 3D Wire Sculpturing Guided with Mixed Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, May 2017; pp. 3693–3704. [Google Scholar]
- Yun, K.; Woo, W. Tech-Note: Spatial Interaction Using Depth Camera for Miniature AR. In Proceedings of the 2009 IEEE Symposium on 3D User Interfaces; IEEE Computer Society: Washington, DC, USA, March 2009; pp. 119–122. [Google Scholar]
- Back, M.; Cohen, J.; Gold, R.; Harrison, S.; Minneman, S. Listen Reader: An Electronically Augmented Paper-Based Book. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, March 2001; pp. 23–29. [Google Scholar]
- Leitner, J.; Haller, M.; Yun, K.; Woo, W.; Sugimoto, M.; Inami, M. IncreTable, a Mixed Reality Tabletop Game Experience. In Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology; ACM: New York, NY, USA, December 2008; pp. 9–16. [Google Scholar]
- Hashimoto, S.; Ishida, A.; Inami, M.; Igarashi, T. TouchMe: An Augmented Reality Based Remote Robot Manipulation. In Proceedings of the 21st International Conference on Artificial Reality and Telexistence; ACM: New York, NY, USA, November 2011; Volume 2. [Google Scholar]
- Chuah, J.H.; Lok, B.; Black, E. Applying mixed reality to simulate vulnerable populations for practicing clinical communication skills. IEEE Trans. Vis. Comput. Graph. 2013, 19, 539–546. [Google Scholar] [CrossRef]
- Jiang, H.; Weng, D.; Zhang, Z.; Bao, Y.; Jia, Y.; Nie, M. Hikeyb: High-Efficiency Mixed Reality System for Text Entry. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct); IEEE: Manhattan, NY, USA, October 2018; pp. 132–137. [Google Scholar]
- Leydon, K. Sensing the Position and Orientation of Hand-Held Objects: An Overview of Techniques. 2001. Available online: http://ulsites.ul.ie/csis/sites/default/files/csis_sensing_the_position_and_orientation_of_hand-held_objects.pdf (accessed on 10 September 2021).
- Poupyrev, I.; Harrison, C.; Sato, M. Touché: Touch and Gesture Sensing for the Real World. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing; ACM: New York, NY, USA, September 2012; p. 536. [Google Scholar]
- Kienzle, W.; Whitmire, E.; Rittaler, C.; Benko, H. ElectroRing: Subtle Pinch and Touch Detection with a Ring. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, May 2021; pp. 1–12. [Google Scholar]
- Han, T.; Anderson, F.; Irani, P.; Grossman, T. HydroRing: Supporting Mixed Reality Haptics Using Liquid Flow. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology; ACM: New York, NY, USA, October 2018; pp. 913–925. [Google Scholar]
- Bai, H.; Li, S.; Shepherd, R.F. Elastomeric Haptic Devices for Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 2009364. Available online: https://www.researchgate.net/profile/Shuo-Li-38/publication/352066240_Elastomeric_Haptic_Devices_for_Virtual_and_Augmented_Reality/links/60d1f3f345851566d5809357/Elastomeric-Haptic-Devices-for-Virtual-and-Augmented-Reality.pdf (accessed on 10 September 2021). [CrossRef]
- Talasaz, A.; Trejos, A.L.; Patel, R.V. Effect of Force Feedback on Performance of Robotics-Assisted Suturing. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob); IEEE: Manhattan, NY, USA, June 2012; pp. 823–828. [Google Scholar]
- Akinbiyi, T.; Reiley, C.E.; Saha, S.; Burschka, D.; Hasser, C.J.; Yuh, D.D.; Okamura, A.M. Dynamic Augmented Reality for Sensory Substitution in Robot-Assisted Surgical Systems. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society; IEEE: Manhattan, NY, USA, September 2006; pp. 567–570. [Google Scholar]
- Günther, S.; Schön, D.; Müller, F.; Mühlhäuser, M.; Schmitz, M. PneumoVolley: Pressure-Based Haptic Feedback on the Head through Pneumatic Actuation. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, April 2020; pp. 1–10. [Google Scholar]
- Schorr, S.B.; Okamura, A.M. Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE Trans. Haptics 2017, 10, 418–430. [Google Scholar] [CrossRef]
- Meli, L.; Pacchierotti, C.; Salvietti, G.; Chinello, F.; Maisto, M.; De Luca, A.; Prattichizzo, D. Combining wearable finger haptics and augmented reality: User evaluation using an external camera and the microsoft hololens. IEEE Robot. Autom. Lett. 2018, 3, 4297–4304. [Google Scholar] [CrossRef] [Green Version]
- Yang, T.H.; Kim, J.R.; Jin, H.; Gil, H.; Koo, J.H.; Kim, H.J. Recent Advances and Opportunities of Active Materials for Haptic Technologies in Virtual and Augmented Reality. Adv. Funct. Mater. 2021, 2008831. Available online: https://onlinelibrary.wiley.com/doi/full/10.1002/adfm.202008831 (accessed on 10 September 2021). [CrossRef]
- Kim, T.; Cooperstock, J.R. Enhanced Pressure-Based Multimodal Immersive Experiences. In Proceedings of the 9th Augmented Human International Conference; ACM: New York, NY, USA, February 2018; pp. 1–3. [Google Scholar]
- Qian, G.; Peng, B.; Zhang, J. Gesture Recognition Using Video and Floor Pressure Data. In Proceedings of the 2012 19th IEEE International Conference on Image Processing; IEEE: Manhattan, NY, USA, September 2012; pp. 173–176. [Google Scholar]
- Minh, V.T.; Katushin, N.; Pumwa, J. Motion tracking glove for augmented reality and virtual reality. Paladyn J. Behav. Robot. 2019, 10, 160–166. [Google Scholar] [CrossRef]
- Zhu, M.; Sun, Z.; Zhang, Z.; Shi, Q.; Chen, T.; Liu, H.; Lee, C. Sensory-Glove-Based Human Machine Interface for Augmented Reality (AR) Applications. In Proceedings of the 2020 IEEE 33rd International Conference on Micro Electromechanical Systems (MEMS), Vancouver, BC, Canada, 18–22 January 2020; pp. 16–19. [Google Scholar]
- Nguyen, T.V.; Kamma, S.; Adari, V.; Lesthaeghe, T.; Boehnlein, T.; Kramb, V. Mixed reality system for nondestructive evaluation training. Virtual Real. 2020, 25, 709–718. [Google Scholar] [CrossRef]
- Gül, S.; Bosse, S.; Podborski, D.; Schierl, T.; Hellge, C. Kalman Filter-based Head Motion Prediction for Cloud-based Mixed Reality. In Proceedings of the 28th ACM International Conference on Multimedia; ACM: New York, NY, USA, October 2020; pp. 3632–3641. [Google Scholar]
- Brena, R.F.; García-Vázquez, J.P.; Galván-Tejada, C.E.; Muñoz-Rodriguez, D.; Vargas-Rosales, C.; Fangmeyer, J. Evolution of indoor positioning technologies: A survey. J. Sens. 2017, 2017, 2630413. [Google Scholar] [CrossRef]
- Benyon, D.; Quigley, A.; O’keefe, B.; Riva, G. Presence and digital tourism. AI Soc. 2014, 29, 521–529. [Google Scholar] [CrossRef]
- Schrier, K. Using Augmented Reality Games to Teach 21st Century Skills. In ACM SIGGRAPH 2006 Educators Program; ACM: New York, NY, USA, 2006; p. 15-es. [Google Scholar]
- Sakpere, W.; Adeyeye-Oshin, M.; Mlitwa, N.B. A state-of-the-art survey of indoor positioning and navigation systems and technologies. S. Afr. Comput. J. 2017, 29, 145–197. [Google Scholar] [CrossRef] [Green Version]
- Jiang, W.; Li, Y.; Rizos, C.; Cai, B.; Shangguan, W. Seamless indoor-outdoor navigation based on GNSS, INS and terrestrial ranging techniques. J. Navig. 2017, 70, 1183–1204. [Google Scholar] [CrossRef] [Green Version]
- Zhuang, Y.; Hua, L.; Qi, L.; Yang, J.; Cao, P.; Cao, Y.; Haas, H. A survey of positioning systems using visible LED lights. IEEE Commun. Surv. Tutor. 2018, 20, 1963–1988. [Google Scholar] [CrossRef] [Green Version]
- Afzalan, M.; Jazizadeh, F. Indoor positioning based on visible light communication: A performance-based survey of real-world prototypes. ACM Comput. Surv. (CSUR) 2019, 52, 1–36. [Google Scholar] [CrossRef]
- Madakam, S.; Ramaswamy, R.; Tripathi, S. Internet of Things (IoT): A literature review. J. Comput. Commun. 2015, 3, 164. [Google Scholar] [CrossRef] [Green Version]
- Atsali, G.; Panagiotakis, S.; Markakis, E.; Mastorakis, G.; Mavromoustakis, C.X.; Pallis, E.; Malamos, A. A mixed reality 3D system for the integration of X3DoM graphics with real-time IoT data. Multimed. Tools Appl. 2018, 77, 4731–4752. [Google Scholar] [CrossRef]
- Natephra, W.; Motamedi, A. Live Data Visualization of IoT Sensors Using Augmented Reality (AR) and BIM. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC 2019), Banff, AB, Canada, 21–24 May 2019; pp. 21–24. [Google Scholar]
- Phupattanasilp, P.; Tong, S.R. Augmented reality in the integrative internet of things (AR-IoT): Application for precision farming. Sustainability 2019, 11, 2658. [Google Scholar] [CrossRef] [Green Version]
- Hoppenstedt, B.; Schmid, M.; Kammerer, K.; Scholta, J.; Reichert, M.; Pryss, R. Analysis of Fuel Cells Utilizing Mixed Reality and IoT Achievements. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics; Springer: New York, NY, USA, 2019; pp. 371–378. [Google Scholar]
- Yasmin, R.; Salminen, M.; Gilman, E.; Petäjäjärvi, J.; Mikhaylov, K.; Pakanen, M.; Pouttu, A. Combining IoT Deployment and Data Visualization: Experiences within Campus Maintenance Use-Case. In Proceedings of the 2018 9th International Conference on the Network of the Future (NOF); IEEE: Manhattan, NY, USA, November 2018; pp. 101–105. [Google Scholar]
- Pokric, B.; Krco, S.; Drajic, D.; Pokric, M.; Rajs, V.; Mihajlovic, Z.; Jovanovic, D. Augmented Reality Enabled IoT Services for Environmental Monitoring Utilising Serious Gaming Concept. J. Wirel. Mob. Netw. Ubiquitous Comput. Dependable Appl. 2015, 6, 37–55. [Google Scholar]
- Morris, A.; Guan, J.; Lessio, N.; Shao, Y. Toward Mixed Reality Hybrid Objects with IoT Avatar Agents. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 766–773. [Google Scholar]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Papadopoulos, T.; Evangelidis, K.; Kaskalis, T.H.; Evangelidis, G.; Sylaiou, S. Interactions in Augmented and Mixed Reality: An Overview. Appl. Sci. 2021, 11, 8752. https://doi.org/10.3390/app11188752