Article

Implementation and Evaluation of Walk-in-Place Using a Low-Cost Motion-Capture Device for Virtual Reality Applications

Department of Computer Science, Gyeongsang National University, Jinju-si 52828, Republic of Korea
* Authors to whom correspondence should be addressed.
Sensors 2024, 24(9), 2848; https://doi.org/10.3390/s24092848
Submission received: 7 March 2024 / Revised: 25 April 2024 / Accepted: 29 April 2024 / Published: 30 April 2024
(This article belongs to the Section Navigation and Positioning)

Abstract

Virtual reality (VR) is used in many fields, including entertainment, education, training, and healthcare, because it allows users to experience challenging and dangerous situations that may be impossible in real life. Advances in head-mounted display technology have enhanced visual immersion, offering content that closely resembles reality. However, several factors can reduce VR immersion, particularly issues with interactions in the virtual world, such as locomotion, and locomotion technology has advanced comparatively slowly. Research continues on hardware such as treadmills and on motion tracking with depth cameras, but these approaches are costly and space-intensive. This paper presents a walk-in-place (WIP) algorithm that uses Mocopi, a low-cost motion-capture device, to track user movements in real time. Its feasibility for VR applications was evaluated by comparing its performance with that of a treadmill using the absolute trajectory error metric and survey data collected from human participants. The proposed WIP algorithm with the low-cost Mocopi exhibited performance similar to that of the high-cost treadmill, with significantly positive results for spatial awareness. This study is expected to contribute to solving the issue of spatial constraints when experiencing infinite virtual spaces.

1. Introduction

Virtual reality (VR) is a technological revolution that combines the real and virtual worlds to transport users to various novel environments. It can present environments and situations that may be difficult to experience in real life and offers innovative experiences in entertainment, education, training, medicine, and many other fields [1]. VR technology integrates visual, auditory, and physical senses to provide users with an experience that closely resembles reality, and various technical tools are essential for enabling interactions that mimic real-world behavior within the VR environment. Currently, accurately reflecting real-world movements in VR is challenging due to spatial and technical constraints associated with recognizing movements in real environments and reflecting them in virtual ones [2]. Although large virtual spaces can be theoretically implemented in VR, spatial constraints are encountered when implementing movements such as walking. These include tracking hardware and network limitations, physical obstacles, and intricate technical issues related to conveying user movements to the virtual environment [3].
A potential solution to these challenges is to improve locomotion in VR environments [4]. Locomotion refers to controlling user movements within the virtual environment and is a crucial factor for improving user experience [5]. Methods that enable users to move naturally in VR environments as they would while walking or running during everyday activities in the real world are being actively researched. Locomotion research has employed various methods, with walk-in-place (WIP) being the most actively researched [2,6]. An evaluation of locomotion methods demonstrated that users with no prior VR experience rated WIP as being more immersive than controller manipulation [5]. Various methods have been proposed for implementing WIP, wherein the Kinect camera, which can measure depth [7,8], trackers that can track body movements [9], treadmills [10], or body-worn inertial measurement unit sensors have been employed [11].
Developing WIP techniques that can represent real-world movements in VR environments has been a long-standing challenge. Despite numerous technological advancements, several gaps remain, primarily due to the limited amount of physical space. The main objective of WIP techniques is to provide an immersive experience by translating real-world movements into the virtual world and simulating an infinite virtual space within a confined physical space. However, previously proposed WIP setups either require considerable space for camera placement or rely on bulky hardware such as a treadmill. Establishing a WIP environment is also expensive: accurate tracking is required to replicate real-world motion in VR, and procuring multiple Kinect cameras, trackers, sensors, and treadmills can entail significant costs.
This paper presents a WIP algorithm that uses low-cost motion-capture equipment and demonstrates its potential for use in VR applications. By monitoring the walking behavior in real time using body-worn sensors and applying the collected data to the WIP algorithm, the system can enhance user immersion by providing meaningful interactions in virtual worlds. In the performance evaluations, the absolute trajectory errors (ATEs) were compared with the 3D coordinates obtained by walking along a specified path in VR using Mocopi and a treadmill. The final performance was evaluated based on the results of a user survey.
The remainder of this paper is organized as follows. Section 2 presents a detailed review of related studies. Section 3 describes the proposed configuration and algorithm for WIP implementation. Section 4 analyzes and discusses the experimental results to evaluate the performance of the proposed technique. Section 5 discusses the limitations, and theoretical and managerial implications, of the study. Section 6 summarizes the significance and findings of the study and presents the conclusions.

2. Related Works

VR is a revolutionary technology that allows users to interact with the real world by transporting them into virtual environments that differ from the real world. However, replicating natural movements in these environments remains technically complex. WIP technology is a key solution for mimicking the walking behavior of a user in a VR environment, and considerable research and development has been conducted in this field. A key aspect of WIP research is the development and exploration of various techniques, which has contributed to improving both the user experience and system performance.
For instance, the use of hardware such as bracelets to track arm movements while walking has been explored; the resulting data can be interpreted as movement commands by algorithms that enable movement within a virtual space [12]. Implementing WIP using the hands can be simple and highly accurate. However, it renders standard hand-based interaction in the virtual world unfeasible, because the hands are occupied by locomotion.
Another area of research involves techniques that utilize the leg movements that accompany regular movements. These WIP methods leverage the physical characteristics of how humans lift their legs while walking to track the motion of the lifted leg. Thus, they imitate real-world walking in more detail, and various types of WIP methods are being developed. Initially, a complex mix of sensors was used, but advances in small-scale sensor technology have enabled the utilization of sensors built into head-mounted displays (HMDs), smartphones, and Kinect cameras. These methods can track body movements in detail and provide users with realistic walking experiences.
WIP techniques that employ Kinect cameras are divided into single- and multi-camera methods. Single-camera methods feature high body-recognition rates when the user is looking straight ahead or turning; however, they struggle to track occluded areas when the body is turned at an angle or sideways. Therefore, multi-camera methods have been widely studied to address the limitations of single cameras [13,14,15].
However, implementing WIP using a Kinect camera requires the user to remain within the camera's field of view, imposing distance and space limitations [16]. Methods that employ physical sensors offer more robust tracking; however, depending on the sensor, they may require external assistance or incur significant costs. Additionally, various other WIP techniques have been studied, including tracking and extracting data from physical motion or mechanically determining the walking motion [2]. Therefore, WIP research is ongoing with the aim of understanding the diversity of WIP technologies, their strengths and weaknesses, and ways to improve them [9].
Researchers have also investigated tracking devices attached to the body, such as the VIVE tracker, which exhibits high accuracy [17]. However, these trackers require external sensors owing to their inherent hardware characteristics. Body-attached tracking devices may lose tracking if the user moves out of the sensor area or the tracker is occluded by an object. Additionally, compatibility issues may arise when devices other than those specified are used, depending on the tracker type. Research on using treadmills for WIP applications is also ongoing, driven by constant hardware advancements [18]. This method offers the advantage of closely replicating users' real-world movements; however, the cost of creating such an environment remains a challenge.
Recent WIP research has focused on 360-degree omnidirectional walking and treadmills as important locomotion technologies, along with body-worn omnidirectional walking trackers that allow users to traverse environments larger than the available physical space [19]. These methods offer locomotion without requiring a large physical space and induce less dizziness than traditional locomotion techniques [20]. However, they are expensive or bulky, which makes commercialization challenging.
These findings indicate that WIP is an important technique for mimicking locomotion in VR environments and various studies continue to improve its performance and evaluate its advantages and disadvantages. This continued effort to provide a natural VR experience plays a significant role in the development of VR technologies.

3. Materials and Methods

This section describes the setup employed for detecting user movements and implementing the proposed WIP technique. Real-time body data obtained through motion-capture technologies are necessary to ensure smooth virtual interactions. Various devices exist for obtaining accurate body data, such as Kinect cameras, which capture the body externally, or sensors attached to the body. Therefore, this section introduces the hardware selection and communication methods, along with the algorithms for recognizing body data collected via motion-capture devices and implementing WIP.

3.1. Trackers

Various types of hardware at different price points attach sensors to the body to collect data, including acceleration, gyroscope, and pressure readings. This study employed the Mocopi system, comprising six sensors, each weighing approximately 8 g, owing to its ease of use and accurate tracking performance. For tracking, the sensors must be attached to the head, waist, wrists, and ankles. Additionally, Mocopi is 20 times cheaper than a treadmill and 2.5 times cheaper than the VIVE tracker, while offering excellent tracking performance.

3.2. HMDs

An HMD displays a virtual world in 3D through two displays situated in front of the eyes, creating a realistic and immersive experience. Although many HMDs are available, this study used the Meta Quest 2 and the VIVE Pro: the Meta Quest 2 for the proposed WIP implementation with Mocopi, and the VIVE Pro for the treadmill condition.

3.3. Treadmill

A treadmill can be used to reproduce the walking or running motion of a user in a VR environment. It provides a realistic experience by allowing users to physically move around a virtual world, but it has disadvantages such as high cost and weight, bulkiness, and space requirements for installation. Nevertheless, we selected it as the comparison device for the proposed WIP algorithm owing to its high tracking accuracy and immersiveness compared with other locomotion hardware. Its performance was compared with that of Mocopi using ATEs and user surveys.

3.4. Data Communication between Smartphone and PC

Next, the process of sending and receiving Mocopi data is described. Mocopi can connect directly to a PC, but not in real time; therefore, we relayed the data through a smartphone. First, we ran the Mocopi application on a smartphone and established a Bluetooth connection between the smartphone and the Mocopi sensors. The data were then transferred over Wi-Fi to a PC on the same network for visualization. The transport and internet protocols used were the user datagram protocol (UDP) and IPv4, respectively.
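For illustration, the sketch below shows a minimal receiving loop on the PC side, assuming the smartphone app streams UDP packets to a known port. The port number and the hand-off to a separate decoder are our assumptions; in practice, a dedicated receiver plugin can handle both reception and decoding of the proprietary motion format.

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

// Minimal UDP listener sketch for the smartphone-to-PC link.
// The port number (12351) and the raw-packet hand-off are assumptions
// for illustration; decoding the motion format is out of scope here.
public class MocopiUdpListener
{
    private readonly UdpClient client;
    private readonly Thread thread;
    private volatile bool running = true;

    public event Action<byte[]> PacketReceived;

    public MocopiUdpListener(int port = 12351)
    {
        client = new UdpClient(port); // bind to the assumed export port
        thread = new Thread(ReceiveLoop) { IsBackground = true };
        thread.Start();
    }

    private void ReceiveLoop()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (running)
        {
            try
            {
                byte[] packet = client.Receive(ref remote); // blocking receive
                PacketReceived?.Invoke(packet);             // hand off to a decoder
            }
            catch (SocketException)
            {
                break; // socket closed during shutdown
            }
        }
    }

    public void Stop()
    {
        running = false;
        client.Close();
    }
}
```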

3.5. Data Communication between PC and HMD

To enable 360-degree WIP, the user's rotation must be unrestricted. This study aimed to enhance freedom of movement by avoiding a data cable that limits rotation. With the Meta Quest 2, screen data are wirelessly transferred from the PC to the HMD through the Air Link function, which is available when the PC and HMD are on the same network; the 5 GHz channel was used to prevent communication delays that may cause cognitive dissonance in the user. The VIVE Pro is connected to the PC through a DisplayPort cable and a hub that sends and receives the visual data; to keep this wire from restricting movement, the cable was suspended from the ceiling, allowing the user to rotate freely.

3.6. Calibration

Prior to motion capture, the equipment must be calibrated to obtain accurate body data. Mocopi uses a dedicated application for calibration, which involves a straightforward process. However, errors can accumulate over time owing to the nature of the equipment, necessitating regular calibration.
Prior to calibration, data pertaining to the sensor positions and the user's information are collected. Calibration is based on three pieces of information: the user's height, their default pose, and the motion of stepping out of and back into the default pose. Based on these data, the user's pose error is calibrated. We performed calibration at the start of the study and after any change in location.

3.7. Motion Capture

Motion-capture technology extracts 3D data of body movements using sensors or cameras, and we used calibrated Mocopi data to estimate the joint positions and postures of users. Mocopi detects the acceleration and angular velocity of body movements using six sensors, worn on the head, waist, wrists, and ankles, to calculate and apply 3D positions and postures. It uses 3-axis accelerometers and 3-axis gyroscopes to digitize the movements, determining sensor orientations and integrating acceleration data to estimate 3D positions. With inertial sensing alone, positioning errors accumulate over time and reduce accuracy, but Sony's proprietary artificial intelligence (AI) model directly estimates the joint positions to minimize these errors [21,22,23].
Tracking joints without attached sensors is problematic because, owing to the complexity of the human body and the high degrees of freedom of its joints, the position and posture of the intermediate joints between two sensor locations cannot be uniquely determined through simple geometric calculations. However, AI models trained on diverse human movements resolve this ambiguity toward natural joint positions. Figure 1 and Table 1 show the human skeletal structure defined by Mocopi, and Figure 2 shows the positions of joints without sensors estimated from the sensor attachment locations.

3.8. WIP Algorithm with Mocopi

Various physical movements can be observed while a person walks, such as the arms swinging, the knees bending, and the soles of the feet lifting off the ground. This study proposes an algorithm that analyzes walking-in-place behavior using data from sensors attached to the ankles and waist. While the user walks in place, one foot is on the ground and the other is off it. The imaginary line from the foot on the ground to the waist sensor is called "baseline A," whereas that from the lifted foot to the waist sensor is called "baseline B." The angle between these two baselines is measured to determine the user's movement, as illustrated in Figure 3, and is calculated as follows:
$$\cos\theta = \frac{\mathbf{A}\cdot\mathbf{B}}{\lVert\mathbf{A}\rVert\,\lVert\mathbf{B}\rVert}, \tag{1}$$
$$\theta = \cos^{-1}\left(\frac{\mathbf{A}\cdot\mathbf{B}}{\lVert\mathbf{A}\rVert\,\lVert\mathbf{B}\rVert}\right), \tag{2}$$
where Equation (1) expresses the cosine of the inter-baseline angle through the dot product of the baseline vectors A and B, and Equation (2) solves Equation (1) for θ.
Although the angle between the two baselines determines the user's gait, using this method alone can cause malfunctions in certain situations. For instance, the angle change that occurs while the user is standing, as shown in Figure 4, can be mistaken for a walking motion; this confusion can arise whether the user is performing a WIP motion or merely raising and lowering a leg. To address this issue, this study detects the moments when a foot touches and leaves the ground by attaching a collider component to the 3D model. A collider is a physical collision-detection component in the Unity game engine that precisely detects the interactions between the foot and the ground. The "OnTriggerStay" function is activated while the foot is in contact with the ground, whereas the "OnTriggerExit" function is activated when it lifts off; in the Unreal Engine, the "OnComponentBeginOverlap" and "OnComponentEndOverlap" functions serve the same purpose. This allowed us to evaluate the internal angle only while a foot is off the ground, precisely determining the user's movement. Figure 5 shows the process of implementing WIP using the aforementioned internal angles.
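A minimal Unity sketch of this contact gate is shown below, assuming each foot of the 3D model carries a trigger collider and the ground object is tagged "Ground"; the tag and field names are our assumptions.

```csharp
using UnityEngine;

// Sketch of the foot-contact gate described above. A trigger collider is
// attached to each foot of the 3D model; the ground plane has its own
// collider, and one of the two objects needs a Rigidbody for trigger
// events to fire. The "Ground" tag is an illustrative assumption.
public class FootContact : MonoBehaviour
{
    public bool IsGrounded { get; private set; }

    private void OnTriggerStay(Collider other)
    {
        if (other.CompareTag("Ground"))
            IsGrounded = true;  // foot is touching the ground
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Ground"))
            IsGrounded = false; // foot has lifted off
    }
}
```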

3.9. Designing the WIP User Interface

A visualization method was employed to display the captured motion data to users. Figure 6a shows a 3D Unity model built from the user's body data collected via Mocopi, whereas Figure 6b shows the user's point of view while wearing the HMD. Figure 7 shows how the walking motion is verified visually; the angles of the right and left feet are displayed in Figure 6b as text in the top-left and top-right corners of the screen, respectively. A sub-camera was placed at the bottom right to show the entire body and verify the direction of movement. The white dotted line at the center of the road was generated from the target-path data, and the width of the road, marked with a solid yellow line, was set to 1 m.
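As a sketch of how the on-screen angle readouts can be driven, the snippet below assumes legacy UI Text elements anchored to the top corners of the view; the component and field names are illustrative.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the HUD described above: the left and right baseline angles
// are written to Text elements in the top corners of the screen.
// Field names and formatting are illustrative assumptions.
public class WipHud : MonoBehaviour
{
    [SerializeField] private Text leftAngleText;   // top-left readout
    [SerializeField] private Text rightAngleText;  // top-right readout

    public void UpdateAngles(float leftAngle, float rightAngle)
    {
        leftAngleText.text = $"L: {leftAngle:F1}°";
        rightAngleText.text = $"R: {rightAngle:F1}°";
    }
}
```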

4. Experiments

4.1. Experimental Setup

To evaluate the performance of the proposed WIP technique with Mocopi, the experiment used the Vector3 values generated by traveling along the three pre-created paths shown in Figure 8. The 1 m wide paths were created using the Bézier Path Creator asset, which is freely available. Comparative data were collected using a treadmill as the reference device, and the results are presented as graphs and values. The experiment included 16 participants of both sexes, with ages ranging from 20 to 70. The participants were divided into two groups of eight in a between-subjects design comprising the Mocopi and treadmill conditions. To ensure that the participants did not gain prior knowledge of the maps or the WIP technique during the experiment, each participant experienced either Mocopi or the treadmill, and the maps were presented in random order. Furthermore, a survey was employed to evaluate both methods objectively. The experiment was conducted under the following conditions:
  • Application: Unity 2021.3.8.f1.
  • System specifications: AMD Ryzen 5 5600X/32 GB RAM/AMD RX 6700 XT GPU with 12 GB of GDDR6 memory.
  • HMDs: Meta Quest 2 and VIVE Pro.
  • Trackers: Mocopi and KAT Walk Mini S.
To obtain the Vector3 values of the map, the position values were extracted point-by-point in the CSV format for the x-, y-, and z-axes, with the y-axis set to zero to align it with the ground. Table 2 summarizes the number and size of the Vector3 values of the target path.
The line renderer feature was used to visualize the user’s path and extract the collected data in the CSV format, as shown in Figure 9. Only the x and z values were required for data comparison; therefore, the y value was set to zero.
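The sketch below illustrates this export step, assuming the points are read back from the LineRenderer and written with the y component zeroed; the file name and exact CSV layout are our assumptions.

```csharp
using System.Globalization;
using System.IO;
using System.Text;
using UnityEngine;

// Sketch of the path export described above: the LineRenderer's points
// are written to CSV with the y component forced to zero. The file name
// is an illustrative assumption.
public class PathCsvExporter : MonoBehaviour
{
    [SerializeField] private LineRenderer line;

    public void Export(string path = "user_path.csv")
    {
        var positions = new Vector3[line.positionCount];
        line.GetPositions(positions);

        var sb = new StringBuilder("x,y,z\n");
        foreach (var p in positions)
        {
            // Project onto the ground plane (y = 0) before writing.
            sb.AppendLine(string.Format(CultureInfo.InvariantCulture,
                "{0},{1},{2}", p.x, 0f, p.z));
        }
        File.WriteAllText(path, sb.ToString());
    }
}
```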

4.2. Experimental Results

4.2.1. Performance Evaluations

This study employed the ATE, a commonly used metric for path evaluation, computed from the target path and the actual path traveled by the user. The ATE is important in many path-tracking applications: it measures the error between the target and user paths, and a smaller value indicates closer agreement between them.
To compute the ATE over the cumulative distance, the two paths must be sampled with a matching number of points. However, the raw Vector3 indices differ across users because no two users travel exactly the same path. Therefore, in this study, new points were generated every 1 m of arc length along each user path, and likewise every 1 m along the target path, so that both were sampled consistently; the absolute path lengths are listed in Table 3.
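For concreteness, the sketch below illustrates this resampling-and-matching procedure as we read it: both paths are resampled at 1 m arc-length increments and compared index by index, summing the point-to-point distances. The 1 m step matches the text; boundary handling and naming are our assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the ATE computation described above: both paths are resampled
// at fixed 1 m arc-length increments, matched 1:1 by index, and the
// per-point distances are summed. Trailing-segment handling is an assumption.
public static class Ate
{
    public static List<Vector3> Resample(IList<Vector3> path, float step = 1f)
    {
        var result = new List<Vector3> { path[0] };
        float carried = 0f; // arc length accumulated since the last emitted point
        for (int i = 1; i < path.Count; i++)
        {
            Vector3 from = path[i - 1], to = path[i];
            float segment = Vector3.Distance(from, to);
            while (carried + segment >= step)
            {
                float t = (step - carried) / segment; // fraction into this segment
                from = Vector3.Lerp(from, to, t);
                result.Add(from);                     // emit a point every 1 m
                segment = Vector3.Distance(from, to);
                carried = 0f;
            }
            carried += segment;
        }
        return result;
    }

    // Sum of point-to-point distances between two equally sampled paths.
    public static float Compute(List<Vector3> target, List<Vector3> user)
    {
        int n = Mathf.Min(target.Count, user.Count);
        float total = 0f;
        for (int i = 0; i < n; i++)
            total += Vector3.Distance(target[i], user[i]);
        return total;
    }
}
```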
As described above, the 16 participants, spanning their 20s to 60s and including both sexes, were split into two groups of eight: one group experienced Mocopi and the other the treadmill.
To obtain the ATE values, the target and user path data were matched in a 1:1 ratio, and the ATE (m) was calculated by measuring and summing the distances between matched points. Figure 10 illustrates the target (blue) and user (red) paths with each point matched; the distance between the two paths is indicated by the green lines. Figure 11 illustrates the Mocopi, treadmill, and target paths for one of the users who experienced the butterfly-shaped path, and Table 4 presents the ATEs: Mocopi was more accurate, with a mean ATE of 61.48 compared with 66.30 for the treadmill, although the standard deviation across treadmill users was lower. Figure 12 shows the corresponding paths for the Korean Peninsula-shaped path, and Table 5 presents the ATEs: the average ATE of the treadmill users was 64.48, whereas that of the Mocopi users was 63.59, again favoring Mocopi, although the standard deviation across treadmill users was again lower. Figure 13 illustrates the paths for the star-shaped path, and Table 6 lists the ATEs: the average ATE of the treadmill users was 99.26, whereas that of the Mocopi users was 93.20, indicating that Mocopi outperformed the treadmill in both mean and standard deviation.
Comparing the WIP performance of the Mocopi and treadmill users through the ATE results revealed slight variations among individuals; however, these were too small to be noticeable in VR. Nevertheless, the graphs and tables indicate that some users recorded higher-than-average ATE values. Interviews revealed that this was primarily caused by inexperience: beyond poor treadmill manipulation, unwanted movements arose when the sensors reacted to objects other than the feet, such as clothing, resulting in mis-operation.

4.2.2. User Evaluations

Although the ATE figures statistically relate the target and user paths, they do not capture evaluation criteria such as user immersion, wearability, and spatiality. This study therefore comprehensively evaluated the user experience of the proposed WIP with Mocopi and with the treadmill through a survey comprising nine questions regarding immersion, wearability, convenience, reality, difficulty, responsiveness, connectivity, freedom, and spatiality, as follows:
  • Immersion: Is the level of immersion in your WIP experience satisfactory?
  • Wearability: Did you experience any physical discomfort during your WIP experience?
  • Convenience: Were you physically comfortable during your WIP experience?
  • Reality: Are you satisfied with the realism of WIP?
  • Difficulty: What were some of the challenges you encountered while trying out WIP?
  • Responsiveness: Are you satisfied with the real-time movement of WIP?
  • Connectivity: How would you evaluate the connection between the physical and virtual worlds when using WIP technology?
  • Freedom: What is your evaluation of the increased freedom provided by WIP?
  • Spatiality: Are you satisfied with the space allocated in WIP?
Each question could be answered with a rating on a scale of 1–5, and a higher score indicated a better evaluation. The survey was administered to all 16 participants.
As shown in Table 7, Mocopi scored higher than the treadmill on four of the nine survey questions, and the treadmill scored higher on the remaining five. The treadmill scored higher for immersion, reality, responsiveness, connectivity, and (marginally) freedom because it employs a normal walking motion on a slip pad rather than walking in place. However, it scored lower than Mocopi for wearability, difficulty, convenience, and spatiality because it requires considerable energy to operate and is bulky, making it difficult to move from one place to another.

5. Discussion

5.1. Limitations

The low-cost motion-capture device employed in this study, Mocopi, offers several advantages but also presents technical limitations. One is the potential for delays in the real-time processing and transmission of sensor data, largely due to the inherent limits of Bluetooth connectivity. Such delays can reduce immersion and degrade the user experience, which is a significant concern for VR applications that require dynamic, real-time responses. Consequently, future research should focus on technical improvements that minimize this delay, and the development of more reliable, faster data-transfer protocols is essential to address connectivity issues. These advances would contribute significantly to the commercialization of VR technology and improve the quality of the user experience.
One of the limitations of this study is the small sample size. The dataset used is not large enough to allow for generalizability of the results. In particular, a larger and more diverse sample is needed to assess the effectiveness of the algorithm in different settings and conditions. This could be an important factor in assessing the universality and reliability of the algorithm, and future research should be conducted using a larger sample.

5.2. Theoretical Implications

This research extends existing theories on VR technology through the development of a low-cost VR tracking algorithm. In particular, it is significant in that it advances our understanding of how to make VR technology more cost-effective and accessible. By exploring the impact of low-cost technologies on the performance and user experience of VR systems, this research strengthens the theoretical foundations of technology acceptance models and user experience. Furthermore, it contributes to the theoretical debate on the impact of technological innovation on user adoption by presenting different approaches to lowering the barriers to VR technology through efficient cost structures.

5.3. Managerial Implications

The results of this study have significant implications for companies that are considering the commercial exploitation of VR technology. The development of low-cost algorithms provides an opportunity to make VR technology accessible, especially to small and medium-sized enterprises and startups with limited budgets. This will enable companies that are interested in utilizing VR in education, training, marketing, product development, and other areas to deliver high-quality VR experiences at a lower cost. The research also provides useful guidance for the development of marketing strategies to promote the adoption of VR technology and reach a wider range of users. Companies can use the findings to develop more affordable VR solutions and create strategies to increase user acceptance of the technology.

5.4. Significance of the Results

The low-cost VR tracking algorithm developed in this research represents a significant step forward in lowering the cost barrier for VR technology and making it accessible to a wider range of users. By demonstrating that efficient tracking can be achieved at low cost, it opens the door to a wider range of applications for VR technology in education, training, and entertainment. This study also provides empirical evidence of the positive impact that the development of a low-cost tracking solution has on the user’s VR experience. These data are of significant value for theoretical discussions on technology acceptance models and user experience. Future research can further develop these techniques and contribute to the commercial utilization and widespread adoption of VR technology.

6. Conclusions

VR technology has significantly advanced over the years. Although improvements in hardware performance have enhanced its visual impact, providing realistic and natural interactions between the user and VR environment remains challenging. We aimed to allow users to move freely between the real and virtual environments, thereby providing an excellent experience without physical constraints.
This study employed Mocopi, a low-cost motion-capture device, to collect and analyze user movements in real time. Mocopi comprises six small sensors that must be attached to the user’s body and can operate without external sensors. The proposed WIP algorithm can control user movements in a VR environment more naturally without spatial constraints.
To compare the performance of the WIP algorithm, the participants were asked to walk along three target paths using an expensive treadmill, and their ATE values were compared. The results showed that Mocopi performed better than the treadmill. To further evaluate the user experience, we conducted a user survey and found that the treadmill was superior in terms of immersion and realism, whereas Mocopi was superior in terms of spatiality, convenience, and wearability. These results demonstrate the superiority of the proposed WIP algorithm when using Mocopi in VR environments.
In this study, we attempted to mitigate the limitations of the sample size by employing multiple paths, including star, butterfly, and peninsula shapes. Despite these efforts, we recognize that the small sample size may affect the results. In future studies, we will strive to overcome this limitation by increasing the sample size.
This study strove to make immersive VR experiences more accessible by introducing a low-cost, real-time, full-body tracking technology that can reduce costs by a factor of at least 3 and up to 20 compared with other WIP setups. It thus allows users to have an excellent VR experience and to cross the boundary between real and virtual environments without incurring excessive costs. As an avenue for future research, it is worth investigating whether real-world movements can be experienced seamlessly in a VR environment without additional motion-recognition devices or controllers.

Author Contributions

Conceptualization, R.S.; methodology, R.S.; software, B.C.; validation, S.-M.C. and S.L.; formal analysis, S.-M.C. and S.L.; investigation, R.S.; resources, B.C.; data curation, B.C.; writing—original draft preparation, R.S.; writing—review and editing, S.-M.C. and S.L.; visualization, R.S.; supervision, S.-M.C. and S.L.; project administration, S.-M.C. and S.L.; funding acquisition, S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Regional Innovation Strategy through the National Research Foundation of Korea funded by the Ministry of Education under Grant 2021RIS-003; and in part by the fund of research promotion program, Gyeongsang National University, in 2022.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Chung, D.H. User-based theories and practices on virtual reality. Inf. Policy 2017, 24, 3–29. [Google Scholar]
  2. Boletsis, C.; Chasanidou, D. A typology of virtual reality locomotion techniques. Multimodal Technol. Interact. 2022, 6, 72. [Google Scholar] [CrossRef]
  3. Park, J.H.; Kim, T.K. Representation of Physical Phenomena and Spatial Relations in the Virtual Reality. J. Korea Contents Assoc. 2012, 12, 21–31. [Google Scholar] [CrossRef]
  4. Cherni, H.; Métayer, N.; Souliman, N. Literature review of locomotion techniques in virtual reality. Int. J. Virtual Real. 2020, 20, 1–20. [Google Scholar] [CrossRef]
  5. Boletsis, C.; Cedergren, J.E. VR locomotion in the new era of virtual reality: An empirical comparison of prevalent techniques. Adv. Hum. Comput. Interact. 2019, 2019, 7420781. [Google Scholar] [CrossRef]
  6. Martinez, E.S.; Wu, A.S.; McMahan, R.P. Research trends in virtual reality locomotion techniques. In Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Christchurch, New Zealand, 12–16 March 2022; IEEE Publications: Piscataway, NJ, USA, 2022; Volume 2022, pp. 270–280. [Google Scholar] [CrossRef]
  7. Ioannou, C.; Archard, P.; O’Neill, E.; Lutteroth, C. Virtual performance augmentation in an immersive jump & run exergame. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–15. [Google Scholar] [CrossRef]
  8. Langbehn, E.; Eichler, T.; Ghose, S.; von Luck, K.; Bruder, G.; Steinicke, F. Evaluation of an omnidirectional walking-in-place user interface with virtual locomotion speed scaled by forward leaning angle. In Proceedings of the GI Workshop on Virtual and Augmented Reality (GI VR/AR), Sankt Augustin, Germany, 10–11 September 2015; pp. 149–160. [Google Scholar]
  9. Tan, C.T.; Foo, L.C.; Yeo, A.; Lee, J.S.A.; Wan, E.; Kok, X.K.; Rajendran, M. Understanding user experiences across VR Walking-in-Place locomotion methods. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022; pp. 1–13. [Google Scholar] [CrossRef]
  10. Bashir, A.; De Regt, T.; Jones, C.M. Comparing a friction-based uni-directional treadmill and a slip-style omni-directional treadmill on first-time HMD-VR user task performance, cybersickness, postural sway, posture angle, ease of use, enjoyment, and effort. Int. J. Hum. Comput. Stud. 2023, 179, 103101. [Google Scholar] [CrossRef]
  11. Feasel, J.; Whitton, M.C.; Wendt, J.D. LLCM-WIP: Low-latency, continuous-motion walking-in-place. In Proceedings of the IEEE Symposium on 3D User Interfaces, Reno, NV, USA, 8–9 March 2008; IEEE Publications: Piscataway, NJ, USA, 2008; Volume 2008, pp. 97–104. [Google Scholar] [CrossRef]
  12. Calandra, D.; Lamberti, F.; Migliorini, M. On the usability of consumer locomotion techniques in serious games: Comparing arm swinging, treadmills and walk-in-place. In Proceedings of the 9th International Conference on Consumer Electronics (ICCE-Berlin), Berlin, Germany, 8–11 September 2019; IEEE Publications: Piscataway, NJ, USA, 2019; Volume 2019, pp. 348–352. [Google Scholar] [CrossRef]
  13. Kim, Y.; Baek, S.; Bae, B.C. Motion capture of the human body using multiple depth sensors. ETRI J. 2017, 39, 181–190. [Google Scholar] [CrossRef]
  14. Langbehn, E. Development and Evaluation of Interactive Locomotion User Interfaces. In Proceedings of the IEEE Virtual Reality (VR) (Doctoral Consortium), Greenville, SC, USA, 19–23 March 2016. [Google Scholar]
  15. Bu, S.; Lee, S. Easy to calibrate: Marker-less calibration of multiview azure Kinect. CMES Comput. Model. Eng. Sci. 2023, 136, 3083–3096. [Google Scholar] [CrossRef]
  16. Razzaque, S.; Swapp, D.; Slater, M.; Whitton, M.C.; Steed, A. Redirected walking in place. In Proceedings of the EGVE, Barcelona, Spain, 30–31 May 2002; Volume 2. [Google Scholar]
  17. Caserman, P.; Garcia-Agundez, A.; Konrad, R.; Göbel, S.; Steinmetz, R. Real-time body tracking in virtual reality using a Vive tracker. Virtual Real. 2019, 23, 155–168. [Google Scholar] [CrossRef]
  18. Cherni, H.; Nicolas, S.; Métayer, N. Using virtual reality treadmill as a locomotion technique in a navigation task: Impact on user experience—Case of the KatWalk. Int. J. Virtual Real. 2021, 21, 1–14. [Google Scholar] [CrossRef]
  19. Cakmak, T.; Hager, H. Cyberith virtualizer: A locomotion device for virtual reality. In Proceedings of the ACM SIGGRAPH 2014 Emerging Technologies, Vancouver, Canada, 10–14 August 2014; p. 1. [Google Scholar] [CrossRef]
  20. Muhammad, A.S.; Ahn, S.C.; Hwang, J.I. Active panoramic VR video play using low latency step detection on smartphone. In Proceedings of the IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 8–10 January 2017; IEEE Publications: Piscataway, NJ, USA, 2017; Volume 2017, pp. 196–199. [Google Scholar] [CrossRef]
  21. Ghonasgi, K.; Mirsky, R.; Haith, A.M.; Stone, P.; Deshpande, A.D. A Novel Control Law for Multi-Joint Human-Robot Interaction Tasks While Maintaining Postural Coordination. In Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, USA, 1–5 October 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 6110–6116. [Google Scholar]
  22. Ghonasgi, K.; Mirsky, R.; Haith, A.M.; Stone, P.; Deshpande, A.D. Quantifying changes in kinematic behavior of a human-exoskeleton interactive system. In Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 23–27 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 10734–10739. [Google Scholar]
  23. Ghonasgi, K.; Mirsky, R.; Bhargava, N.; Haith, A.M.; Stone, P.; Deshpande, A.D. Kinematic coordinations capture learning during human–exoskeleton interaction. Sci. Rep. 2023, 13, 10322. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Skeletal structure defined in Mocopi. The numbers are the indices of the joints, and the names of the joints are organized in Table 1.
Figure 2. (a) Sensor-attached joint positions and (b) estimated joint positions where sensors are not attached. From the left wrist position marked in red in (a), the positions of the left shoulder and elbow, indicated by the red line in (b), are estimated.
Figure 3. Implementation of the proposed WIP technique with Mocopi.
Figure 4. Exceptions to applying WIP.
Figure 5. WIP implementation process.
Figure 6. (a) Three-dimensional model based on captured motion data and (b) view of the user wearing the HMD, including their orientation and movement information.
Figure 7. Walking motion for right and left foot.
Figure 8. Three target paths: (a) butterfly-shaped, (b) Korean Peninsula-shaped, and (c) star-shaped.
Figure 9. User's (a) path and (b) viewpoints.
Figure 10. Plotted path and error.
Figure 11. Mocopi, treadmill, and target paths for a user who experienced the butterfly-shaped path.
Figure 12. Mocopi, treadmill, and target paths for a user who experienced the Korean Peninsula-shaped path.
Figure 13. Mocopi, treadmill, and target paths for a user who experienced the star-shaped path.
Table 1. Indexes and names of joints defined in Mocopi.

| Index | Joint Name    | Index | Joint Name     |
|-------|---------------|-------|----------------|
| 0     | root          | 14    | left_hand      |
| 1     | torso_1       | 15    | right_shoulder |
| 2     | torso_2       | 16    | right_up_arm   |
| 3     | torso_3       | 17    | right_low_arm  |
| 4     | torso_4       | 18    | right_hand     |
| 5     | torso_5       | 19    | left_up_leg    |
| 6     | torso_6       | 20    | left_low_leg   |
| 7     | torso_7       | 21    | left_foot      |
| 8     | neck_1        | 22    | left_toes      |
| 9     | neck_2        | 23    | right_up_leg   |
| 10    | head          | 24    | right_low_leg  |
| 11    | left_shoulder | 25    | right_foot     |
| 12    | left_up_arm   | 26    | right_toes     |
| 13    | left_low_arm  |       |                |
Table 2. Target path extraction data.

| Path              | Butterfly | Korean Peninsula | Star      |
|-------------------|-----------|------------------|-----------|
| CSV Vector3 index | 2681      | 2392             | 2067      |
| Map size          | 44 × 43 m | 34 × 58 m        | 72 × 70 m |
Table 3. Target path total distance (m).

| Path          | Butterfly | Korean Peninsula | Star |
|---------------|-----------|------------------|------|
| Path distance | 180       | 160              | 250  |
Table 4. ATEs for the butterfly-shaped path.

| Mocopi User        | ATE   | Treadmill User     | ATE   |
|--------------------|-------|--------------------|-------|
| user1              | 50.24 | user1              | 76.66 |
| user2              | 63.73 | user2              | 70.79 |
| user3              | 61.84 | user3              | 68.05 |
| user4              | 78.35 | user4              | 69.99 |
| user5              | 57.01 | user5              | 56.10 |
| user6              | 53.75 | user6              | 65.81 |
| user7              | 67.22 | user7              | 69.16 |
| user8              | 59.66 | user8              | 53.81 |
| Average            | 61.48 | Average            | 66.30 |
| Standard deviation | 8.16  | Standard deviation | 7.18  |
Table 5. ATEs for the Korean Peninsula-shaped path.

| Mocopi User        | ATE   | Treadmill User     | ATE   |
|--------------------|-------|--------------------|-------|
| user1              | 98.38 | user1              | 49.83 |
| user2              | 55.81 | user2              | 70.58 |
| user3              | 70.24 | user3              | 53.18 |
| user4              | 56.15 | user4              | 67.97 |
| user5              | 61.45 | user5              | 58.91 |
| user6              | 50.11 | user6              | 86.61 |
| user7              | 67.84 | user7              | 61.78 |
| user8              | 48.71 | user8              | 66.94 |
| Average            | 63.59 | Average            | 64.48 |
| Standard deviation | 14.98 | Standard deviation | 10.76 |
Table 6. ATEs for the star-shaped path.

| Mocopi User        | ATE    | Treadmill User     | ATE    |
|--------------------|--------|--------------------|--------|
| user1              | 117.85 | user1              | 70.45  |
| user2              | 102.85 | user2              | 105.00 |
| user3              | 84.02  | user3              | 112.73 |
| user4              | 96.10  | user4              | 126.35 |
| user5              | 80.26  | user5              | 104.38 |
| user6              | 110.11 | user6              | 99.64  |
| user7              | 79.94  | user7              | 100.01 |
| user8              | 74.48  | user8              | 75.48  |
| Average            | 93.20  | Average            | 99.26  |
| Standard deviation | 14.89  | Standard deviation | 17.21  |
Table 7. Survey results.

| Evaluation Element | Mocopi Average | Mocopi Std. Dev. | Treadmill Average | Treadmill Std. Dev. |
|--------------------|----------------|------------------|-------------------|---------------------|
| Immersion          | 4.375          | 0.484            | 4.625             | 0.484               |
| Wearability        | 4.625          | 0.695            | 4                 | 0.707               |
| Convenience        | 4.5            | 0.5              | 3.625             | 0.695               |
| Reality            | 4.125          | 0.78             | 4.5               | 0.707               |
| Difficulty         | 4.375          | 0.484            | 4.125             | 0.599               |
| Responsiveness     | 4.25           | 0.433            | 4.625             | 0.484               |
| Connectivity       | 3.875          | 0.599            | 4.5               | 0.5                 |
| Freedom            | 3.875          | 0.599            | 4                 | 0.5                 |
| Spatiality         | 4.75           | 0.433            | 3.375             | 0.484               |

