Topic Editors

Department of Management and Production Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin, Italy
DPIA Department, University of Udine, Udine, Italy
Department of Civil and Mechanical Engineering, University of Cassino and Lazio Meridionale, Cassino, Italy
Department of Management and Production Engineering, Polytechnic of Turin, 10129 Turin, Italy

Extended Reality (XR): AR, VR, MR and Beyond

Abstract submission deadline
closed (31 December 2022)
Manuscript submission deadline
closed (28 February 2023)
Viewed by
66404

Topic Information

Dear Colleagues,

In recent years, virtual and augmented reality usage scenarios have changed significantly, moving from specialized use for research purposes only to widespread deployment across a large number of market applications and a significant number of users. This new situation has stimulated the growth of an impressive number of solutions in different domains, from military to entertainment.

Technological improvements, especially in smartphone devices, such as in camera performance and features, have made it possible to create a tangible interaction between the real and virtual domains that can be perceived as a "window on the real world". For this reason, several major brands have incorporated virtual and augmented reality into their commercial strategies, focusing attention on "mobile augmented reality" (MAR).

Moreover, the flexibility of AR/VR supports its synergistic use with other technologies, acting as an "integration platform" for technology empowerment, for instance with machine and deep learning (AI), which, particularly in the computer vision domain, may provide highly disruptive solutions, e.g., in the medical domain for disease diagnosis and for precise surgery assistance. Minimally invasive surgery (MIS) technologies are another viable example, where combining real-world data (live camera images in real time) with virtual objects (artificially generated three-dimensional medical images from transrectal ultrasound or computed tomography/MRI) can provide tangible support to medical tasks. Moreover, with the COVID-19 situation expected to force a significant percentage of the workforce to work from home multiple days a week by the end of 2021, a change of perspective will be necessary in many collaborative tasks, such as conference calls, which are often undermined by the lack of a direct personal presence; AR/VR technologies can, for instance, create a more socially conducive environment in which everybody can improve "social interaction". Thus, given the current surge of AR/VR applications and the new scenario created by the COVID-19 pandemic, it is important to synthesize the current knowledge about the interactions and impacts between society and the latest and novel AR/VR applications.

The aim of this Topic is to advance scholarly understanding of how these technologies can be used theoretically, empirically, methodologically, and practically, and of how society and AR/VR interact with and impact each other, also in relation to COVID-19, which has completely modified many usage scenarios and made the ability to exploit the power of "virtuality", in its different forms, fundamental to overcoming the limitations of human interaction and support.

Prof. Dr. Enrico Vezzetti
Dr. Barbara Motyl
Dr. Domenico Speranza
Dr. Luca Ulrich
Topic Editors

Keywords

  • human machine interaction
  • 3D modeling
  • human body modeling
  • augmented reality
  • virtual reality
  • product innovation
  • product design
  • augmented surgery
  • augmented medical diagnosis
  • augmented maintenance
  • augmented education
  • head mounted displays
  • artificial intelligence
  • computer vision
  • education
  • machine learning
  • deep learning
  • mixed reality
  • extended reality
  • xR

Participating Journals

Journal (abbrev.)              Impact Factor  CiteScore  Launched  First Decision (median)  APC
Applied Sciences (applsci)     2.5            5.3        2011      17.8 days                CHF 2400
Journal of Imaging (jimaging)  2.7            5.9        2015      20.9 days                CHF 1800
Sensors (sensors)              3.4            7.3        2001      16.8 days                CHF 2600
Electronics (electronics)      2.6            5.3        2012      16.8 days                CHF 2400
Diagnostics (diagnostics)      3.0            4.7        2011      20.5 days                CHF 2600

Preprints.org is a multidisciplinary platform providing a preprint service that is dedicated to sharing your research from the start and empowering your research journey.

MDPI Topics is cooperating with Preprints.org and has built a direct connection between MDPI journals and Preprints.org. Authors are encouraged to post a preprint at Preprints.org prior to publication in order to:

  1. Immediately share your ideas ahead of publication and establish your research priority;
  2. Protect your idea from being stolen with this time-stamped preprint article;
  3. Enhance the exposure and impact of your research;
  4. Receive feedback from your peers in advance;
  5. Have it indexed in Web of Science (Preprint Citation Index), Google Scholar, Crossref, SHARE, PrePubMed, Scilit and Europe PMC.

Published Papers (19 papers)

17 pages, 20496 KiB  
Article
Design of Augmented Reality Training Content for Railway Vehicle Maintenance Focusing on the Axle-Mounted Disc Brake System
by Hwi-Jin Kwon, Seung-Il Lee, Ju-Hyung Park and Chul-Su Kim
Appl. Sci. 2021, 11(19), 9090; https://doi.org/10.3390/app11199090 - 29 Sep 2021
Cited by 7 | Viewed by 2757
Abstract
Light maintenance training for electric multiple-unit components in railway operating organizations is generally conducted using maintenance manuals and work videos, following the guidelines of each organization. These manuals are booklets, which are complicated and inconvenient for maintenance operators to carry. Therefore, training content that visualizes maintenance procedures in three-dimensional (3D) space is necessary to overcome the drawbacks of booklet-type training. In this study, we developed augmented reality (AR)-based training content for railway vehicle maintenance to increase training efficiency. Providing warning signs for risky procedures reduces human error, and transparency control allows trainees to check the product hierarchy. A virtual experience based on the maintenance manual is provided to improve maintenance proficiency. An axle-mounted disc brake system maintenance manual is implemented in AR to reflect the requirements of maintenance operators. The convenience of this tool is improved by loading the AR content on a mobile device. Two methods of verification were used: the system usability scale (SUS) survey and a training efficiency evaluation. The resulting SUS grade was B (excellent), and training efficiency improved by 34%. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
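The SUS grade reported in the abstract above comes from the standard System Usability Scale, which turns ten 1-5 Likert responses into a 0-100 score. As a minimal sketch of the standard scoring formula (the function name is illustrative, not from the article):

```python
def sus_score(responses):
    """Standard System Usability Scale score from ten Likert responses (1-5).

    Odd-numbered items contribute (response - 1) points and even-numbered
    items contribute (5 - response); the summed contributions are scaled by
    2.5, yielding a score between 0 and 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5
```

A uniformly favorable questionnaire (5 on every odd, positively worded item and 1 on every even, negatively worded item) yields the maximum score of 100; letter grades such as the article's "B" are then assigned from published percentile tables.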

11 pages, 1600 KiB  
Article
Evaluation of a Linear Measurement Tool in Virtual Reality for Assessment of Multimodality Imaging Data—A Phantom Study
by Natasha Stephenson, Kuberan Pushparajah, Gavin Wheeler, Shujie Deng, Julia A. Schnabel and John M. Simpson
J. Imaging 2022, 8(11), 304; https://doi.org/10.3390/jimaging8110304 - 8 Nov 2022
Viewed by 1654
Abstract
This study aimed to evaluate the accuracy and reliability of a virtual reality (VR) system line measurement tool using phantom data across three cardiac imaging modalities: three-dimensional echocardiography (3DE), computed tomography (CT) and magnetic resonance imaging (MRI). The same phantoms were also measured using industry-standard image visualisation software packages. Two participants performed blinded measurements on volume-rendered images of standard phantoms both in VR and on an industry-standard image visualisation platform. The intra- and interrater reliability of the VR measurement method was evaluated by intraclass correlation coefficient (ICC) and coefficient of variance (CV). Measurement accuracy was analysed using Bland–Altman and mean absolute percentage error (MAPE). VR measurements showed good intra- and interobserver reliability (ICC ≥ 0.99, p < 0.05; CV < 10%) across all imaging modalities. MAPE for VR measurements compared to ground truth were 1.6%, 1.6% and 7.7% in MRI, CT and 3DE datasets, respectively. Bland–Altman analysis demonstrated no systematic measurement bias in CT or MRI data in VR compared to ground truth. A small bias toward smaller measurements in 3DE data was seen in both VR (mean −0.52 mm [−0.16 to −0.88]) and the standard platform (mean −0.22 mm [−0.03 to −0.40]) when compared to ground truth. Limits of agreement for measurements across all modalities were similar in VR and standard software. This study has shown good measurement accuracy and reliability of VR in CT and MRI data with a higher MAPE for 3DE data. This may relate to the overall smaller measurement dimensions within the 3DE phantom. Further evaluation is required of all modalities for assessment of measurements <10 mm. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
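The accuracy statistics used above, MAPE against ground truth and Bland-Altman bias with limits of agreement, follow standard definitions; a minimal sketch, independent of the study's own software:

```python
from statistics import mean, stdev

def mape(measured, truth):
    """Mean absolute percentage error (in %) of measurements vs. ground truth."""
    return 100 * mean(abs(m - t) / abs(t) for m, t in zip(measured, truth))

def bland_altman(a, b):
    """Bland-Altman bias (mean difference) and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)  # assumes approximately normal differences
    return bias, (bias - half_width, bias + half_width)
```

A bias near zero with narrow limits of agreement indicates no systematic offset between the two measurement methods, which is the pattern the study reports for CT and MRI data in VR.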

14 pages, 5373 KiB  
Article
Evaluation of the SteamVR Motion Tracking System with a Custom-Made Tracker
by Marcin Maciejewski
Appl. Sci. 2021, 11(14), 6390; https://doi.org/10.3390/app11146390 - 10 Jul 2021
Cited by 2 | Viewed by 2469
Abstract
The paper presents research on a SteamVR tracker developed for a man-portable air-defence training system. The tests were carried out in laboratory conditions, with the tracker placed on a launcher model together with elements ensuring a faithful reproduction of operational conditions. During the measurements, the static tracker was moved and rotated within a working area; the range of translations and rotations corresponded to the typical requirements of a shooting simulator application. The results, containing the registered position and orientation values, were plotted on 3D charts showing the tracker's operation. Further analyses determined the systematic and random errors of the SteamVR system operating with a custom-made tracker. The obtained random errors of 0.15 mm for position and 0.008° for orientation demonstrate the high precision of the measurements. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
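The systematic and random errors quoted above follow the usual convention for repeated measurements of a static pose: the systematic error is the mean offset from the reference value, and the random error is the spread (sample standard deviation) of the repeated readings. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
from statistics import mean, stdev

def pose_errors(samples, reference):
    """Split repeated measurements of a static pose into error components.

    Systematic error: mean deviation of the samples from the reference value.
    Random error: sample standard deviation of the measurements.
    """
    systematic = mean(samples) - reference
    random_error = stdev(samples)
    return systematic, random_error
```

The same decomposition applies per axis to both position (mm) and orientation (degrees) readings.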

15 pages, 12271 KiB  
Article
VR-Based Job Training System Using Tangible Interactions
by Seongmin Baek, Youn-Hee Gil and Yejin Kim
Sensors 2021, 21(20), 6794; https://doi.org/10.3390/s21206794 - 13 Oct 2021
Cited by 6 | Viewed by 2174
Abstract
Virtual training systems are in increasing demand because real-world training involves high costs or risks, whereas training can be conducted safely in virtual environments. For virtual training to be effective for users, it is important to provide realistic training situations; however, virtual reality (VR) content operated with standard VR controllers for experiential learning differs significantly from real content in terms of tangible interaction. In this paper, we propose a method for enhancing presence and immersion during virtual training by applying various sensors to tangible virtual training, tracking the movement of the real tools used during training and virtualizing the entire body of the actual user for transfer to a virtual environment. The proposed training system connects virtual and real-world spaces through an actual object (e.g., an automobile) to provide the feeling of actual touch during virtual training. Furthermore, the system measures the posture of the tools (steam gun and mop) and the degree of touch and applies them during training (e.g., a steam car wash). User testing was conducted to validate the increase in the effectiveness of virtual job training. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

14 pages, 1897 KiB  
Article
Vibrating Tilt Platform Enhancing Immersive Experience in VR
by Grzegorz Zwoliński, Dorota Kamińska, Anna Laska-Leśniewicz and Łukasz Adamek
Electronics 2022, 11(3), 462; https://doi.org/10.3390/electronics11030462 - 4 Feb 2022
Cited by 3 | Viewed by 2673
Abstract
One of the disadvantages of past virtual reality systems was their visual-only interfaces. With the development of haptic technology, however, peripheral solutions to enhance the VR experience are gaining momentum, and many haptic systems are being developed to deepen VR immersion. This paper deals with the concept of a vibrating tilt platform that can change three angles of inclination to reinforce the VR experience. The proposed system is flexible and adaptable to different sports, health, and education applications. We present the mechanical and mechatronic aspects of the platform and usability testing results based on an immersive geological experience application. The first tests assessed cybersickness, perceived realism, and anxiety through both subjective (questionnaire) and objective (electroencephalogram) measurements. The results indicate that our platform increased anxiety levels and was perceived as realistic; adding vibrations and tilting had a considerable impact on the immersion level and brain activity. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

16 pages, 20989 KiB  
Article
EXIT 360°—EXecutive-Functions Innovative Tool 360°—A Simple and Effective Way to Study Executive Functions in Parkinson’s Disease by Using 360° Videos
by Francesca Borgnis, Francesca Baglio, Elisa Pedroli, Federica Rossetto, Mario Meloni, Giuseppe Riva and Pietro Cipresso
Appl. Sci. 2021, 11(15), 6791; https://doi.org/10.3390/app11156791 - 23 Jul 2021
Cited by 6 | Viewed by 2680
Abstract
Executive dysfunction represents a common non-motor symptom in Parkinson's disease (PD), with a substantial negative impact on daily functioning and quality of life. Assessing executive functions (EFs) with ecological tools is therefore essential. The ecological limitations of traditional neuropsychological tests have led to the increased use of virtual reality and 360° environment-based tools for assessing EFs in real life. This study aims to evaluate the efficacy and usability of the EXecutive-Functions Innovative Tool 360° (EXIT 360°), a 360°-based tool for the evaluation of EFs in PD. Twenty-five individuals with PD and 25 healthy controls (HC) will be assessed with a conventional neuropsychological battery and EXIT 360° delivered via a head-mounted display. EXIT 360° will show a domestic scenario and seven different subtasks of increasing complexity, and will collect verbal responses, reaction times, and physiological data. We expect that EXIT 360° will be judged usable, engaging, and challenging. Moreover, we expect to find high convergent validity (conventional tests and EXIT 360°) and diagnostic validity (individuals with PD vs. HC). The validation of EXIT 360° will allow for the adoption of a fast, ecological, and useful instrument for PD screening, likely transforming the assessment for both the clinic and the patient. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

22 pages, 10809 KiB  
Article
Virtual Training System for the Teaching-Learning Process in the Area of Industrial Robotics
by Jordan S. Ipiales, Edison J. Araque, Víctor H. Andaluz and César A. Naranjo
Electronics 2023, 12(4), 974; https://doi.org/10.3390/electronics12040974 - 15 Feb 2023
Cited by 6 | Viewed by 1860
Abstract
This paper focuses on the development of a virtual training system applying simulation techniques such as Full Simulation and Hardware-in-the-Loop (HIL). This virtual reality system is intended as a teaching and learning tool for the area of industrial robotics. For this purpose, mathematical models (kinematic and dynamic) have been considered; these models determine the characteristics and restrictions of the movements of a Scara SR-800 robot. The robot is then virtualized to simulate position and trajectory tasks within virtual environments. The Unity 3D graphics engine (Unity Software Inc., San Francisco, CA, USA) allows the design and development of the training system, which is composed of a laboratory environment and an industrial environment. Both environments contribute to the visualization and evaluation of the movements of the robot through the proposed control algorithm, implemented in mathematical software (MATLAB, MathWorks, USA) and connected through shared memory. This software can in turn be linked to an electronic board (Raspberry Pi) for data acquisition through a wireless connection. Finally, the stability and robustness of the implemented controllers are analyzed, as well as the correct operation of the virtual training system. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

12 pages, 4257 KiB  
Article
Affordable Projector-Based AR Experimental Platform with Fitting Simulation Asset for Exploring Thermal Management
by Xingming Long, Yujie Chen and Jing Zhou
Appl. Sci. 2022, 12(16), 8019; https://doi.org/10.3390/app12168019 - 10 Aug 2022
Viewed by 1611
Abstract
Augmented reality (AR) applied in education provides learners with a possible route to better understanding and more thorough learning. Although a traditional projector can integrate augmented information with real objects without the need to wear AR glasses, projector-based AR systems are unlikely to be widely adopted in education due to cost, weight, and space issues. In this paper, an alternative projector-camera AR platform, utilizing a digital light processing (DLP) module matched with a BeagleBone Black (BB) controller, is proposed for AR physical experiments. After describing the DLP-based AR learning design method, an algorithm that pre-deforms the projection content using simulation-based polynomial fitting is presented to keep the virtual asset consistent with the user's actions; a prototype covering the thermal management of power devices is then used to validate the performance of the AR experimental platform. The results show that the DLP-based AR platform is an accurate and interactive AR system, with a response time of 1 s and a registration deviation of 3 mm. It is also an affordable AR learning design tool, with a bill of materials of about $200, and thus casts light on creating AR-based physical experiments to explore more physical phenomena. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
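The pre-deformation step described above, fitting how projected content actually lands on the surface and then inverting that mapping, can be illustrated in one dimension with a linear least-squares calibration (the paper fits polynomials; this simplified linear version and all names in it are illustrative):

```python
def fit_linear_predistortion(intended, observed):
    """Fit observed = a * intended + b by least squares, then return the
    inverse mapping, so that pre-deformed content lands where intended."""
    n = len(intended)
    mx = sum(intended) / n
    my = sum(observed) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(intended, observed))
         / sum((x - mx) ** 2 for x in intended))
    b = my - a * mx

    def predistort(target):
        # Feed this value to the projector so the content appears at `target`.
        return (target - b) / a

    return predistort
```

For instance, if calibration shows the projector maps position x to 2x + 1, pre-distorting a target of 5 yields 2, which the projector then maps back onto 5.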

24 pages, 6283 KiB  
Article
The Dynamic Target Motion Perception Mechanism of Tactile-Assisted Vision in MR Environments
by Wei Wang, Ning Xu, Haiping Liu, Jue Qu, Sina Dang and Xuefeng Hong
Sensors 2022, 22(22), 8931; https://doi.org/10.3390/s22228931 - 18 Nov 2022
Viewed by 1536
Abstract
In the mixed reality (MR) environment, the task of target motion perception is usually undertaken by vision. This approach suffers from poor discrimination and a high cognitive load when tasks are complex, and it cannot meet the needs of the air traffic control field for rapid capture and precise positioning of dynamic targets in the air. To address this problem, we conducted a multimodal optimization study of target motion perception judgment, controlling a hand tactile sensor so that tactile sensation assists vision in the MR environment and adapts to the requirements of future development-led interactive tasks under the mixed reality holographic aviation tower. Motion perception tasks are usually divided into urgency sensing for multiple targets and precise position tracking for single targets, according to the number of targets and the task division. Therefore, we designed experiments to investigate the correlation between tactile intensity-velocity correspondence and target urgency, and between the PRS (position, rhythm, sequence) tactile indication scheme and position tracking, and evaluated them through a comprehensive experiment. We obtained the following conclusions: (1) high, higher, medium, lower, and low tactile intensities bias human visual cognition toward fast, faster, medium, slower, and slow moving targets, respectively, and this correspondence can significantly improve the efficiency of participants' judgment of target urgency; (2) under the PRS tactile indication scheme, position-based rhythm and sequence cues improve the tracking of a target's dynamic position, with rhythm cues providing the larger benefit, although adding rhythm and sequence cues at the same time causes clutter; (3) tactile-assisted vision considerably improves the comprehensive perception of dynamic target movement.
The above findings are useful for the study of target motion perception in MR environments and provide a theoretical basis for subsequent research on the cognitive mechanism and quantification of tactile indication in MR environments. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

14 pages, 1427 KiB  
Review
Existing and Emerging Approaches to Risk Assessment in Patients with Ascending Thoracic Aortic Dilatation
by Nina D. Anfinogenova, Valentin E. Sinitsyn, Boris N. Kozlov, Dmitry S. Panfilov, Sergey V. Popov, Alexander V. Vrublevsky, Alexander Chernyavsky, Tatyana Bergen, Valery V. Khovrin and Wladimir Yu. Ussov
J. Imaging 2022, 8(10), 280; https://doi.org/10.3390/jimaging8100280 - 14 Oct 2022
Cited by 5 | Viewed by 2532
Abstract
Ascending thoracic aortic aneurysm is a life-threatening disease that is difficult to detect prior to the occurrence of a catastrophe. Epidemiological patterns of ascending thoracic aortic dilations/aneurysms remain understudied, and their risk assessment may be improved. The electronic databases PubMed/Medline 1966–2022, Web of Science 1975–2022, Scopus 1975–2022, and RSCI 1994–2022 were searched. Current guidelines recommend a purely aortic diameter-based assessment of thoracic aortic aneurysm risk, but over 80% of ascending aorta dissections occur at a size below the recommended threshold of 55 mm. Moreover, a 55 mm diameter criterion could exclude a vast majority (up to 99%) of patients from preventive surgery. The authors review several visualization-based and alternative approaches proposed to better predict the risk of dissection in patients with a borderline dilated thoracic aorta. Imaging-based assessments of the biomechanical aortic properties, including Young's elastic modulus, the Windkessel function, compliance, distensibility, wall shear stress, pulse wave velocity, and some other parameters, have been proposed to improve the risk assessment in patients with ascending thoracic aortic aneurysm. While the authors do not argue for shifting the diameter threshold to the left, they emphasize the need for more personalized solutions that integrate imaging data with the patient's genotypes and phenotypes in this heterogeneous pathology. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

30 pages, 1313 KiB  
Review
Applications Analyses, Challenges and Development of Augmented Reality in Education, Industry, Marketing, Medicine, and Entertainment
by Dafnis Cain Villagran-Vizcarra, David Luviano-Cruz, Luis Asunción Pérez-Domínguez, Luis Carlos Méndez-González and Francesco Garcia-Luna
Appl. Sci. 2023, 13(5), 2766; https://doi.org/10.3390/app13052766 - 21 Feb 2023
Cited by 8 | Viewed by 7063
Abstract
This study aims to develop systematic research about augmented reality (AR) problems, challenges, and benefits in current applications across five fields of interest. Articles were selected from scientific, technical, academic, and medical databases of digital journals and open access papers about AR. The PRISMA method was used to develop the investigation, which allowed us to observe interesting facts and coincidences about complexities and successful cases of AR implementation in the disciplines of education, marketing, medicine, entertainment, and industry. The summary provided in this study is the result of exploring 60 recent articles found and selected by relevance using the PRISMA method. The main objective of this paper is to orient and update researchers regarding current applications, benefits, challenges, and problems in AR implementation for future studies and developments. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

20 pages, 5251 KiB  
Article
A Wireless Hand Grip Device for Motion and Force Analysis
by Victor Becerra, Francisco J. Perales, Miquel Roca, José M. Buades and Margaret Miró-Julià
Appl. Sci. 2021, 11(13), 6036; https://doi.org/10.3390/app11136036 - 29 Jun 2021
Cited by 2 | Viewed by 3430
Abstract
A prototype portable device that allows for simultaneous hand and finger motion tracking and precise force measurement has been developed. Wireless microelectromechanical systems based on inertial and force sensors are suitable for tracking bodily measurements; in particular, they can be used for hand interaction with computer applications. Our interest is to design a multimodal wireless hand grip device that measures and evaluates this activity for ludic or medical rehabilitation purposes. The accuracy and reliability of the proposed device have been evaluated against two different commercial dynamometers (Takei model 5101 TKK, Constant 14192-709E). We introduce a testing application to provide visual feedback of all device signals. The combination of interaction forces and movements makes it possible to simulate the dynamic characteristics of handling a virtual object with the fingers and palm in rehabilitation applications or serious games. The combination of these technologies with open and portable software is very useful in the design of applications for assistance and rehabilitation purposes, which is the main objective of the device. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

17 pages, 23943 KiB  
Article
The More, the Better? Improving VR Firefighting Training System with Realistic Firefighter Tools as Controllers
by Seunggon Jeon, Seungwon Paik, Ungyeon Yang, Patrick C. Shih and Kyungsik Han
Sensors 2021, 21(21), 7193; https://doi.org/10.3390/s21217193 - 29 Oct 2021
Cited by 4 | Viewed by 3273
Abstract
A virtual reality (VR) controller plays a key role in supporting interactions between users and the virtual environment. This paper investigates the relationship between the user experience and VR control device modality. We developed a VR firefighting training system integrated with four control devices adapted from real firefighting tools. We iteratively improved the controllers and VR system through a pilot study with six participants and conducted a user study with 30 participants to assess two salient human factor constructs—perceived presence and cognitive load—with three device modality conditions (two standard VR controllers, four real tools, and a hybrid of one real tool and one standard VR controller). We found that having more realistic devices that simulate real tools does not necessarily guarantee a higher level of user experience, highlighting a strategic approach to the development and utilization of VR control devices. Our study gives empirical insights on establishing appropriate combinations of VR control device modality in the context of field-based VR simulation and training. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

27 pages, 10619 KiB  
Article
Exploration and Assessment of Interaction in an Immersive Analytics Module: A Software-Based Comparison
by Sofia Karam, Raed Jaradat, Michael A. Hamilton, Vidanelage L. Dayarathna, Parker Jones and Randy K. Buchanan
Appl. Sci. 2022, 12(8), 3817; https://doi.org/10.3390/app12083817 - 10 Apr 2022
Viewed by 1480
Abstract
The focus of computer systems in the field of visual analytics is to make the results clear and understandable. However, enhancing human-computer interaction (HCI) in the field is less investigated. Data visualization and visual analytics (VA) are usually performed using traditional desktop settings and mouse interaction. These methods are based on the window, icon, menu, and pointer (WIMP) interface, which often results in information clutter that is difficult to analyze and understand, especially by novice users. Researchers believe that introducing adequate, natural interaction techniques to the field is necessary for building effective and enjoyable visual analytics systems. This work introduces a novel virtual reality (VR) module for performing basic visual analytics tasks and aims to explore new interaction techniques in the field. A pilot study was conducted to measure the time it takes students to perform basic analytics tasks using the developed VR module and compare it to the time it takes them to perform the same tasks using a traditional desktop, in order to assess the effectiveness of the VR module in enhancing students' performance. The results show that novice users (participants with less programming experience) took about 50% less time to complete tasks using the developed VR module compared to a programming language, namely R. Experts (participants with advanced programming experience) took about the same time to complete tasks under both conditions (R and VR). Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

12 pages, 1585 KiB  
Review
Perception and Action under Different Stimulus Presentations: A Review of Eye-Tracking Studies with an Extended View on Possibilities of Virtual Reality
by Florian Heilmann and Kerstin Witte
Appl. Sci. 2021, 11(12), 5546; https://doi.org/10.3390/app11125546 - 15 Jun 2021
Cited by 7 | Viewed by 3281
Abstract
Visual anticipation is essential for performance in sports. This review provides information on the differences between stimulus presentations and motor responses in eye-tracking studies and considers virtual reality (VR) as a new possibility for presenting stimuli. A systematic literature search on PubMed, ScienceDirect, IEEE Xplore, and SURF was conducted. The number of studies examining the influence of stimulus presentation (in situ, video) is small but still sufficient to describe differences in gaze behavior. The seven reviewed studies indicate that stimulus presentations can cause differences in gaze behavior. Further research should focus on displaying game situations via VR. The advantages of a scientific approach using VR are experimental control and repeatability. In addition, game situations could be standardized and movement responses could be included in the analysis. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

17 pages, 3878 KiB  
Article
Mixed Reality-Enhanced Intuitive Teleoperation with Hybrid Virtual Fixtures for Intelligent Robotic Welding
by Yun-Peng Su, Xiao-Qi Chen, Tony Zhou, Christopher Pretty and Geoffrey Chase
Appl. Sci. 2021, 11(23), 11280; https://doi.org/10.3390/app112311280 - 29 Nov 2021
Cited by 15 | Viewed by 3669
Abstract
This paper presents an integrated scheme based on a mixed reality (MR) and haptic feedback approach for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed robotic tele-welding system features imitative motion mapping from the user’s hand movements to the welding robot motions, and it enables the spatial velocity-based control of the robot tool center point (TCP). The proposed mixed reality virtual fixture (MRVF) integration approach implements hybrid haptic constraints to guide the operator’s hand movements following the conical guidance to effectively align the welding torch for welding and constrain the welding operation within a collision-free area. Onsite welding and tele-welding experiments identify the operational differences between professional and unskilled welders and demonstrate the effectiveness of the proposed MRVF tele-welding framework for novice welders. The MRVF-integrated visual/haptic tele-welding scheme reduced the torch alignment times by 56% and 60% compared to the MRnoVF and baseline cases, with minimized cognitive workload and optimal usability. The MRVF scheme effectively stabilized welders’ hand movements and eliminated undesirable collisions while generating smooth welds. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

19 pages, 19626 KiB  
Article
Usability Testing of Virtual Reality Applications—The Pilot Study
by Dorota Kamińska, Grzegorz Zwoliński and Anna Laska-Leśniewicz
Sensors 2022, 22(4), 1342; https://doi.org/10.3390/s22041342 - 10 Feb 2022
Cited by 31 | Viewed by 6587
Abstract
The need for objective, data-driven usability testing of VR applications is becoming more tangible with the rapid development of numerous VR applications and their increased accessibility. Traditional methods of testing are too time- and resource-consuming and might provide results that are highly subjective. Thus, the aim of this article is to explore the possibility of automating the usability testing of VR applications by using objective features such as HMD built-in head and hand tracking, an EEG sensor, video recording, and other measurable parameters, in addition to automated analysis of subjective data provided in questionnaires. For this purpose, a simple VR application was created which comprised relatively easy tasks that did not generate stress for the users. Fourteen volunteers took part in the study and their signals were monitored to acquire objective automated data. At the same time, an observer took notes of the subjects' behaviour, and their subjective opinions about the experience were recorded in a post-experiment questionnaire. The results acquired from signal monitoring and questionnaires were juxtaposed with the observation and post-interview results to confirm the validity and efficacy of automated usability testing. The results were very promising, proving that automated usability testing of VR applications is potentially achievable. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

13 pages, 2399 KiB  
Article
Can ADAS Distract Driver’s Attention? An RGB-D Camera and Deep Learning-Based Analysis
by Luca Ulrich, Francesca Nonis, Enrico Vezzetti, Sandro Moos, Giandomenico Caruso, Yuan Shi and Federica Marcolin
Appl. Sci. 2021, 11(24), 11587; https://doi.org/10.3390/app112411587 - 7 Dec 2021
Cited by 6 | Viewed by 1940
Abstract
Driver inattention is the primary cause of vehicle accidents; hence, manufacturers have introduced systems to support the driver and improve safety. Nonetheless, advanced driver assistance systems (ADAS) must be properly designed so that the feedback they provide does not become a potential source of distraction for the driver. In the present study, an experiment on auditory and haptic ADAS was conducted with 11 participants, whose attention was monitored during their driving experience. An RGB-D camera was used to acquire the drivers' face data. Subsequently, these images were analyzed using a deep learning-based approach, i.e., a convolutional neural network (CNN) specifically trained to perform facial expression recognition (FER). Analyses were carried out to assess possible relationships between these results and both ADAS activations and event occurrences, i.e., accidents. A correlation between attention and accidents emerged, whilst facial expressions and ADAS activations were found to be uncorrelated; thus, no evidence was found that the designed ADAS are a possible source of distraction. In addition to the experimental results, the proposed approach has proved to be an effective tool for monitoring the driver through the use of non-invasive techniques. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)

16 pages, 5794 KiB  
Article
Interaction in eXtended Reality Applications for Cultural Heritage
by Vensada Okanovic, Ivona Ivkovic-Kihic, Dusanka Boskovic, Bojan Mijatovic, Irfan Prazina, Edo Skaljo and Selma Rizvic
Appl. Sci. 2022, 12(3), 1241; https://doi.org/10.3390/app12031241 - 25 Jan 2022
Cited by 34 | Viewed by 5902
Abstract
Digital technologies in the modern era are almost mandatory for the presentation of all types of cultural heritage. Virtual depictions of crafts and traditions offer users the possibility of time travel, taking them to the past through the use of 3D reconstructions of cultural monuments and sites. However, digital resources alone are not enough to adequately present cultural heritage. Additional information on the historical context, in the form of stories, virtual reconstructions, and digitized objects, is needed. All of this can be implemented using a digital multimedia presentation technique called digital storytelling. Nowadays, interactive digital storytelling is an integral part of many museum exhibitions. This paper gives an overview of the techniques and discusses different means of facilitating interaction in digital storytelling applications for virtual cultural heritage presentations. We describe the ways in which natural interaction and interaction via eXtended Reality (Virtual and Augmented Reality) applications for cultural heritage are made possible. Users find the stories told through these applications educational and entertaining at the same time. Through user-experience studies, we measure users' edutainment levels and present how they react to the implemented interactions. Full article
(This article belongs to the Topic Extended Reality (XR): AR, VR, MR and Beyond)
