Article

Physically Plausible Realistic Grip-Lift Interaction Based on Hand Kinematics in VR

Hyeongil Nam, Chanhee Kim, Kangsoo Kim and Jong-Il Park

1 Department of Computer Science, Hanyang University, Seoul 04763, Republic of Korea
2 Hanwha Systems, 188, Pangyoyeok-ro, Bundang-gu, Seongnam-si 13524, Republic of Korea
3 Department of Electrical and Software Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Electronics 2023, 12(13), 2794; https://doi.org/10.3390/electronics12132794
Submission received: 26 May 2023 / Revised: 14 June 2023 / Accepted: 18 June 2023 / Published: 24 June 2023
(This article belongs to the Section Computer Science & Engineering)

Abstract

Immersive technology, which refers to various novel ways of creating and interacting with applications and experiences, e.g., virtual reality (VR), has been used in simulations and training where preparing real/physical settings is not ideal or possible, or where the use of virtual content is otherwise beneficial. Realizing realistic interactions with virtual content is crucial for a quality experience and for the effectiveness of such simulation and training. In this paper, we propose a kinematics-based realistic hand interaction method to enable a physically plausible grip-lift experience in VR. The method reflects three kinematic characteristics of the hand: the force at contact points, finger flexion, and the speed of hand/finger motion, and we developed a grip-lift interaction prototype using the proposed method. To examine the sense of realism and the hand poses during the grip-lift interaction, we conducted a human subjects experiment using the prototype, which showed positive effects on the perceived realism and usefulness of the interaction. Grip-lifting is a fundamental interaction technique involved in most embodied interaction scenarios. Our method can contribute to the design and development of realistic virtual experiences; we discuss its implications and potential based on our findings.

1. Introduction

Alongside advances in virtual reality (VR) research and development, various user interaction mechanisms have been proposed to effectively control virtual objects using different metaphors in immersive VR environments. Bare-hand interaction approaches are considered an effective form of natural user interface in many use cases and applications, such as simulation, training, and games in VR [1,2,3,4,5,6,7,8]. Grip-lifting is one of the most fundamental and important ways in which people interact with virtual objects, e.g., to pick up and move things around as they want [9,10].
Although sophisticated and delicate interactions are possible in VR via three-dimensional movements of human hands, the diversity of hand gestures and motions also requires users to memorize and learn the interaction mechanisms in advance [11,12]. In addition, most bare-hand interactions in VR do not consider the kinematic characteristics of hands, e.g., how the user grabs and holds the object, or how much hand force is involved when they perform grip-lift actions. Some research has tried to effectively mimic real-world hand interactions by considering kinematic characteristics [2,13], but these studies mostly focused on (or were limited to) the touch points between the hand and the target objects, which can still result in a lack of realism.
In this paper, we aim to achieve more realistic bare-hand grip-lift interactions by considering three different, yet related, kinematic characteristics of the hand that manifest the actual movement of the human hand: (1) forces at contact points, (2) finger flexion, and (3) instantaneous speed during the grip-lift interaction. Here, the force at a contact point indicates the force generated at each of the contact points between the hand surface (fingers and palm) and the touched object. The finger flexion considers the degree of bending of the fingers (or the hand in general), and the instantaneous speed reflects how fast the hand moves at the moment it grabs the target object. We model the final force of the hand during a grip-lift interaction using these three factors, which are justified based on previous hand interaction research in kinematics and robotics [14,15,16]. We should note that our goal here is to mimic a realistic grip-lift experience, not to ensure the success of the grip-lift interaction. In other words, users may or may not be able to lift virtual objects depending on how much force they use to perform the grip-lift task. Figure 1 shows the general concept of the proposed method.
To verify the validity of the proposed grip-lift method, we conducted a human subjects study, where participants were tasked to compare their lifting experience in VR using our kinematics-based method with the experience in a real situation. The participants’ perceived realism of the lifting experience in VR and the similarity of their hand poses during the grip-lift interactions were examined. Our research contributes to the realistic bare-hand interaction literature in the following aspects:
C1: We model a more realistic kinematics-based virtual hand force considering different kinematic characteristics of a real hand.
C2: We prototype a novel kinematics-based bare-hand interaction using the proposed model.
C3: We show the validity of the proposed method through a human subjects study.
The rest of the paper consists of the following sections. Section 2 describes previous research related to hand-based grip-lift interactions, particularly focusing on the characteristics of a human hand that we adopted in our method. Section 3 explains our proposed hand force model for physically plausible grip-lift interactions, and also describes a developed prototype using the model. We present a human subjects experiment to evaluate the proposed method/prototype in Section 4. The results are presented in Section 5, and the findings and implications are discussed in Section 6. Finally, Section 7 outlines the research conclusions.

2. Related Work

This section describes previous research related to our kinematics-based realistic hand interaction, with respect to three kinematic characteristics of hand/fingers: finger force, finger flexion, and speed of hand/finger motion.

2.1. Finger Force in Grip-Lift Interactions

In the field of VR research, previous research suggested physics-based methods to simulate and assess hand interactions with virtual objects [2,9,13,17,18]. There were some studies that enabled the grip-lift interaction with virtual hands, mostly based on the Coulomb friction model. The model, however, considers tangential forces at the contact points on a target lifting object’s surface regardless of the form of the object or posture of hands [2,13]. Nasim and Kim [13] introduced a physics-based grasping algorithm, and applied undifferentiated force only to the fingertips contacting the virtual object. Another study by Höll et al. [2] calculated the forces at contact points simply based on the distance from the finger joint locations. Since that distance ratio is used within a finger joint, there is no difference in the general force generated for each finger joint. However, if the same force is always considered at the internal part of hands regardless of the pose and the force of the hand, grip lifting could be implausibly successful even in situations where the lifting should not (or cannot) be possible in reality, which could break the user’s sense of presence/realism in the virtual environment.
Previous hand kinematics studies have identified force differences across the parts of the fingers when lifting an actual object with the hands [14,19,20]. Moreover, those studies identified force differences at the contact points where the palm and fingers touch an object when trying to grab it [14,19,20]. In particular, the study by Kargov et al. [14] determined the force difference across fingers in detail using the Dyna-8 force measuring system, in descending order of the thumb, index finger, middle finger, ring finger, and little finger. Inspired by these studies, which reflect the detailed force of each finger in grip-lift interactions, our model for realistic grip-lift interactions adopts different hand force distributions.

2.2. Finger Flexion in Grip-Lift Interactions

Previous studies related to hand anatomy and kinematics have shown that the flexor muscles located in the fingers and palm are responsible for the flexion of the wrist and fingers. Not surprisingly, those muscles have an important effect on the grip strength of the fingers and palm [21,22,23,24]. According to previous kinematic studies, hand grasp motion is divided primarily into power, precision, and intermediate (optional) grasps [25,26,27]. Based on this classification, it was confirmed that the force varies with the curvature of the contact area formed by the connected joints of each finger [26]. In addition, Hu et al. [28] identified that greater force is exerted when the fingers are in a bent position. This means that a real hand can take different grip forms depending on finger flexion, and the resulting force differs accordingly.
Some robotics studies enabled object-lifting interactions using the flexion movement of the fingers of (humanoid) robots [15,16,29]. They reported that bending the fingers in this process increases the contact pressure, producing a stronger gripping force [15,16]. Furthermore, evaluating the range of finger joint angles is essential to reflect differences in the flexion of finger joints. A review by Lee et al. [30] explored the function of actual hands from the perspective of bio-dynamics, such as kinematics and dynamics, based on previous studies. Their summary of finger movement identifies the mean maximum flexion angles [30], which we also use in our method.

2.3. Motion Speed in Grip-Lift Interactions

In previous studies, hand interactions were modeled to reflect the speed of the entire hand in the force generated by the hand [2]. However, the speed of the entire hand, or of the consequently moving virtual object, may not be sufficient to reflect the force from the user's hand. In some robotics studies, the speed of each finger joint is biologically linked to the grip force of the hand through the forearm muscles [31,32]. Iyengar et al. [31] confirmed that the moving speed of the arm affects the grip force of the fingers. In another study, Tigue et al. [32] developed and simulated a finger tendon model using the Bond graph modeling method; their biomechanical model reflects each finger's speed based on finger movement driven by the tendons originating from the forearm [32]. Salisbury et al. [33] used basic kinematics of a mechanical hand to determine the relationship between the force applied on an object when grabbing it with the fingers and the change in finger speed. Given these preceding studies in different fields, our research considers the speed of the contact points touching a virtual object and reflects it in the hand force used to interact with the object, as described in the following sections.

3. Methods and Development

To achieve realistic grip-lift interactions in VR, we propose a hand force model for bare-hand interactions that reflects the kinematic characteristics of the hand. We also develop an interaction prototype using the proposed model as shown in Figure 1. This section will describe the details of our method and specifications for the prototype.

3.1. Kinematics-Based Hand Force Model

For a realistic hand force model, we consider the forces at the contact points that touch the target object's surface, the flexion of the fingers, and the instantaneous speed of the hand when the object is grip-lifted. In our model, the force at a contact point (F_c) is defined with different weights considering the size and strength, including friction, of different finger parts; e.g., a thumb has more strength than a little finger, whereas previous studies often applied the same weights consistently [2]. The finger flexion (B_c) represents how tightly the user grabs the object, and we define it as the ratio of the current bending angle of a finger joint to the maximum angle that the joint can reach. We apply the effect of finger flexion to our model as a coefficient, based on previous hand kinematics studies that determined, through various experiments, that finger forces depend on finger flexion when gripping [34,35] and used linear regression analysis to estimate the real hand force [36]. Finally, the instantaneous speed of the hand (S_c) is derived from the movement speed of the finger parts at the moment of lifting; this factor is also inspired by previous hand kinematics research on finger movement speed and fingertip force production [37]. Therefore, we simply include these factors in the formulation as coefficient values.
By multiplying these three factors, we calculate the final kinematics-based force at a single contact point (F_k; Equation (1)). We only consider contact points, i.e., the points where the hand (or fingers) touches the virtual object. Finally, the hand force (F) is calculated by summing the final forces at the n contact points touching the virtual object (Equation (2)). Here, we treat the flexion ratio and the speed as weight values, and the unit of the final hand force (F) is Newtons (N).
F_k = F_c · B_c · S_c    (1)
F = Σ_{i=1}^{n} (F_k)_i    (2)
The final hand force is used to determine whether the user can actually lift the virtual object, depending on the weight of the object. For example, a threshold (T) can be calculated from the object's mass (m) and the gravitational acceleration (g) (Equation (3)); the unit of this threshold is also Newtons (N: kg·m/s²). By comparing the user's hand force with this threshold, the grip-lift interaction either succeeds or fails.
T = m · g    (3)
if F ≥ T, succeed to lift the virtual object; if F < T, fail to lift the virtual object
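To make the formulation concrete, the following Python sketch shows how Equations (1)–(3) combine into a lift decision. It is only an illustration of the model described above, not the prototype implementation; the function and variable names (hand_force, can_lift, and the example contact values) are ours.

```python
# Minimal sketch of the kinematics-based hand force model (Equations (1)-(3)).
G = 9.81  # gravitational acceleration (m/s^2)

def hand_force(contact_points):
    """Equation (2): sum the per-contact forces F_k = F_c * B_c * S_c (Equation (1))."""
    return sum(p["F_c"] * p["B_c"] * p["S_c"] for p in contact_points)

def can_lift(contact_points, mass_kg):
    """Equation (3): the lift succeeds when the total hand force F reaches T = m * g (in N)."""
    return hand_force(contact_points) >= mass_kg * G

# Example: the 1.8 kg object used in the experiment requires roughly 17.7 N.
contacts = [{"F_c": 4.5, "B_c": 1.4, "S_c": 1.2} for _ in range(3)]  # three touching parts
print(hand_force(contacts), can_lift(contacts, 1.8))  # ~22.7 N, True
```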
The details of the three kinematic factors that influence the hand force will be described below.

3.1.1. Force at Contact Point (F_c)

Inspired by previous hand kinematics research [14,38], our method assumes that the force applied to an object is related to the area of the user's hand touching the object. Thus, we only consider the contact points of the hand/fingers to calculate the force of the finger parts. We first divide the hand into different parts that could contact the surface of the virtual object during grip-lifting; specifically, 51 possible contact points—three for each finger part, which is separated by the finger joints, depending on the touching angle (3 × 15), and six on the palm (see the yellow boxes in Figure 2). In our method, the center of the finger part is treated as a contact point with the lifted object. Since each finger has a different strength due to its skeletal/muscle structure, size, and shape, we assign different weights to some of the finger parts in VR, similar to real fingers, inspired by a previous kinematics study [14]. The parts on the thumb, the parts on the index and middle fingertips, and the metacarpophalangeal joint of the index finger (16 parts in total), marked with red circles in Figure 2, are assumed to exert twice the force of the other parts.
The force at a contact point is calculated by simplifying the tangential force in the same way as in the previous study, assuming that the friction coefficient and the normal vector are constant [2]. The force of the finger part touching the object, i.e., the contact point (F_c), is calculated by multiplying the general force of each point (F_g), the friction coefficient considering the surface material (μ), and the normal vector considering the force direction (N_v):
F_c = μ · N_v · F_g
The general force of each contact point (F_g) can be estimated by dividing a known/assumed baseline hand grip force by the number of parts. For example, if a male adult has an average grip force of 300 N, the general force of each part would be about 4.5 N; the parts with red circle marks are 9 N due to the assigned weight. This value can be adjusted depending on the user's profile; e.g., women and children might have a weaker grip force than adult men.
While previous research mostly considered one section per finger part/joint for computational simplicity, or defined the whole area of the finger part as contact points, which required additional calculations [2,39], our method reflects the three-dimensional form of the hand in detail by dividing each finger part into three subsections: left, right, and middle. Moreover, previous studies assigned the same general force value (F_g) to all the contact points regardless of their positions on the hand, which could limit the adaptability of the method [2,19,20]. In contrast, our method can apply different general forces to the contact points considering the kinematic characteristics of the real hand and fingers.
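As a small illustration of this subsection, the sketch below derives the general force F_g from an assumed baseline grip force and evaluates F_c = μ · N_v · F_g per contact point. The 300 N baseline, the doubled weights, and the constant μ and N_v follow the example in the text; the function and constant names are ours.

```python
# Illustrative sketch of the per-contact-point force from Section 3.1.1 (names assumed).
BASELINE_GRIP_FORCE = 300.0   # assumed average grip force of an adult male (N), per the text
NUM_WEIGHTED_UNITS = 67       # 51 contact points, with double-weighted parts counted twice

F_G = BASELINE_GRIP_FORCE / NUM_WEIGHTED_UNITS  # general force per unit, ~4.5 N

def contact_point_force(double_weighted, mu=1.0, n_v=1.0):
    """F_c = mu * N_v * F_g; parts marked with red circles in Figure 2 carry twice the force."""
    f_g = 2.0 * F_G if double_weighted else F_G
    return mu * n_v * f_g

print(contact_point_force(False))  # ~4.5 N for a regular part
print(contact_point_force(True))   # ~9.0 N for a double-weighted part (e.g., on the thumb)
```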

3.1.2. Finger Flexion (B_c)

Another important factor that we consider in modeling the hand force is finger flexion—how much a user bends their fingers to grip an object. The more the finger bends, the more tightly the object is gripped, even with the same general hand force [28]. The user exerts more force when gripping the object tightly, so we assume that the force at a contact point is proportional to the degree of finger flexion. Therefore, we define the finger flexion value (B_c) as the ratio of the current flexion angle of the finger part to the maximum angle that the part can reach. The formula for this finger flexion value is as follows:
B_c = 1 + (A_c / A_max)
where A_c is the current bending angle of the contact point and A_max is the maximum angle to which the contact point can bend. Considering the case when the fingers are fully stretched, and its impact in Equation (1), we add 1 as a default value for the finger flexion; thus, the flexion value ranges from 1 to 2 for a fully stretched finger and a fully bent finger, respectively. Figure 3 illustrates the flexion angles of fingers as an example.
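A minimal sketch of this flexion weight follows; clamping the angle ratio to [0, 1] is our own safeguard and is not stated in the paper.

```python
def flexion_weight(current_angle_deg, max_angle_deg):
    """B_c = 1 + A_c / A_max, ranging from 1 (fully stretched) to 2 (fully bent)."""
    ratio = max(0.0, min(current_angle_deg / max_angle_deg, 1.0))  # clamp to [0, 1]
    return 1.0 + ratio

print(flexion_weight(0, 90))   # 1.0: fully stretched joint
print(flexion_weight(45, 90))  # 1.5: half of the maximum flexion
print(flexion_weight(90, 90))  # 2.0: fully bent joint
```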

3.1.3. Instantaneous Speed (S_c)

The muscles in our arms, hands, and fingers are engaged when we grip and lift an object, influencing all the joints and changing the positions of the body parts [31,40]. In other words, the force exerted by these muscles directly influences the speed of the individual finger movements during the grip-lift action [20]. Not surprisingly, this speed is an important clue for estimating the finger force, which ultimately contributes to the hand force. Therefore, our hand force model considers the finger's instantaneous speed (S_c) when the user grips and lifts the object. The formula used for the instantaneous speed of the contact points on the hand is shown below.
S_c = 1 + (D / t)
where D indicates the distance traveled by the contact points (or the finger parts) where the hand touches the object, and t is the time taken to cover that distance after the object is gripped. Since we consider the instantaneous speed of the motion, we can empirically set a relatively small value for t, e.g., 0.1 s. Similar to the default value of 1 for finger flexion, we also set 1 as the default value for the instantaneous speed, covering the case when there is no movement of the hand. The original unit of D/t is m/s, but we treat the result as a weight value, so the unit can be ignored. The range of S_c would also be roughly from 1 to 2, assuming that the user moves their hand less than 10 cm within 0.1 s when they first grab the object.
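Analogously, the speed weight can be sketched as below; the 0.1 s window follows the text, and treating the result as a unitless coefficient mirrors the description above. The function name is an assumption.

```python
def speed_weight(displacement_m, window_s=0.1):
    """S_c = 1 + D / t, treated as a unitless weight; roughly 1-2 for movements under ~10 cm per 0.1 s."""
    return 1.0 + displacement_m / window_s

print(speed_weight(0.00))  # 1.0: hand at rest when the object is gripped
print(speed_weight(0.05))  # 1.5: contact point moved 5 cm within the 0.1 s window
```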

3.2. Prototype Development

Given the hand force model proposed in Section 3.1, we developed a physically plausible grip-lift interaction prototype in an immersive VR environment. We aim to provide a realistic grip-lift experience to the users, possibly with different virtual objects.
An HTC Vive Pro was used to render the virtual environment and the grip-lift simulation with virtual objects to lift. We used a virtual hand model provided in Epic Games' Unreal Engine 4.25 (https://www.unrealengine.com/ (accessed on 6 April 2022)) and developed an interface for the user to move their virtual hands in the VR environment while moving their own hands in the real world. The HTC Vive Hand Tracking SDK (https://developer.vive.com/resources/vive-sense/hand-tracking-sdk/ (accessed on 6 April 2022)) was used to track the participants' bare hands.
The contact points on the hand were placed based on the finger part description in Section 3.1.1, which consisted of 51 possible contact points with different weights (see Figure 2). Each of these finger parts included a collision detector so that we could identify which parts actually touched the virtual object during the grip-lift interactions. According to the study by Kargov et al. [14], the total force of a hand is approximately 16.7 N, which we used in our prototype considering a relatively light object to lift, e.g., 1–2 kg. Dividing the total force by the number of possible contact points (67 on a hand, counting the double-weighted points as two), the general force of each contact point was set to 0.25 N. We assumed constant values for the surface material friction (e.g., μ = 1) and the normal vector (or the touching direction, N_v = 1) for simplicity of the interaction [41]. For the finger flexion, the angles of the finger joints except for the palm (A_c = 0) were calculated similarly to the previous study by Wang et al. [42], and the maximum flexion values were adopted from the results in Lee et al. [30]. We measured the instantaneous speed of the hand while setting the time window to 0.1 s empirically to consider the momentary speed.
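The constants reported for the prototype can be summarized as a small configuration sketch. The structure and names below are illustrative only; the paper does not describe the internal data layout of the Unreal Engine implementation.

```python
# Prototype constants from Section 3.2 (values from the text; dictionary keys are assumed names).
PROTOTYPE_CONFIG = {
    "total_hand_force_n": 16.7,    # total hand force from Kargov et al. [14]
    "weighted_units": 67,          # 51 contact points, double-weighted points counted twice
    "general_force_n": 16.7 / 67,  # ~0.25 N per contact-point unit
    "mu": 1.0,                     # surface friction coefficient, assumed constant
    "normal_vector": 1.0,          # normal-vector term, assumed constant
    "palm_flexion_angle_deg": 0,   # palm contact points use A_c = 0
    "speed_window_s": 0.1,         # empirical window for the instantaneous speed
}
```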

4. Experiment

We conducted a human subjects experiment to examine whether our proposed method (and prototype) could provide VR users with a realistic grip-lift experience similar to the actual grip lifting in reality. The details of the experiment are described in this section.

4.1. Participants

We recruited 15 participants (male: 12, female: 3; age M = 26.47, SD = 3.34) from our university community. All of the participants had normal or corrected-to-normal vision. None of the participants reported known visual or vestibular disorders, such as color blindness, dyschromatopsia, or balance disorders. All the participants were right-handed and used their right hand during the grip-lift interaction tasks in the experiment. They participated in this study voluntarily without monetary compensation. This study was conducted during the COVID-19 pandemic and followed all necessary guidelines required by our institute.

4.2. Study Design

A within-subjects experiment was designed where the participants performed the same grip-lift interactions in an immersive virtual environment while varying the conditions in two ways:
  • Limited Kinematics (LK): Equivalent finger forces were applied for all the contact points. In other words, the grip-lift interaction was simulated only based on the number of contact points without considering the force weights, flexion, and speed.
  • Full Kinematics (FK): All three kinematic characteristics were applied to simulate the grip-lift interaction as we proposed in Section 3.
Based on our assumption that the consideration of multiple kinematic characteristics will improve the realism of the grip-lift interaction in VR, we established the following hypotheses with respect to the user’s subjective perception and objective hand motion:
H1. Participants' perceived realism of the grip-lift interaction in the FK condition will be higher than in the LK condition.
H2. Participants will perceive the grip-lift interaction in the FK condition as more useful for realistic simulation/training than the LK condition.
H3. Participants' hand poses during the grip-lift interaction in the FK condition will be more realistic (i.e., more similar to the actual hand poses in reality) than the poses in the LK condition.

4.3. Task and Settings

We wanted our participants to compare their grip-lift experience in VR with the actual grip-lift in reality; thus, we first let them experience the real grip-lift in the real world with a real object. Considering the diversity of the participant population, e.g., females and males with different hand strengths, we prepared a relatively light object, a 1.8 kg cylinder bar with a curved surface. The participants were informed of the object's weight and were asked to use their dominant hand to perform the grip-lift task (see Figure 4a). We prepared six grip-lift scenarios to examine participants' perceptions and hand motions during grip-lift interactions. The scenarios addressed the three kinematic characteristics reflected in our hand force model: (1) force at the contact point, (2) finger flexion, and (3) instantaneous speed, with two situations relating to each kinematic factor. For example, participants were guided to use their thumb and index finger (high weight), or their thumb and little finger (low weight), for the tasks related to finger force so that they could use different levels of hand force during the tasks. Similarly, participants were asked to bend their fingers differently and to move them at different speeds for the finger flexion and speed scenarios (see Table 1 for details).
After the grip-lift task with the real object as a baseline experience, the participants performed the same six grip-lift scenarios in VR without the real object (see Figure 4b). By comparing the baseline setting with the real object and the test setting in VR, participants could evaluate how similar the experience in VR was to the experience in reality.

4.4. Procedure

When participants arrived, we explained the general goal of the research, i.e., to compare the grip-lift interaction in reality and in VR, and the details of the experiment. Once a participant agreed to participate in the experiment, they were guided to the experimental space where they performed the grip-lift tasks described in Section 4.3. Once they finished the real grip-lift task with the real object as a baseline experience, we asked them to remember how they felt about their grip-lift experience with the real object so that they could compare it with the experience in VR with the virtual object. After that, we had them don an HTC Vive Pro headset to start the virtual grip-lift experience, which consisted of six different situations (see Table 1). We had the participants perform the grip-lift interactions at least two times for each situation so that they could have sufficient experience. At the end of each situation, the participants answered a questionnaire about their perception of the grip-lift experience in VR compared to the experience in reality. Participants experienced both experimental conditions, LK and FK, in which we varied the kinematic factors used in the hand force model to simulate the virtual grip-lift interaction (see Section 4.2). The experiment finished after the participant completed all the virtual grip-lift interactions.

4.5. Measures

To examine the realism of the virtual grip-lift interaction using our proposed method, we used both subjective and objective measures.

4.5.1. Subjective Measures

To measure participants’ perceptions of the virtual grip-lift interaction, we prepared a questionnaire with four questions, which are about the perceived realism and usefulness of the experienced grip-lift interaction in VR. All the questions were on a 5-point Likert scale (1: strongly disagree; 5: strongly agree).

Perceived Realism

Participants answered three questions about their perceived realism in the grip-lift interaction that they had in VR compared to the previous experience with the real object (see Q1–Q3 in Table 2). Those questions covered different aspects of perceived realism: the lifting result, the engaged hand force, and the hand poses.

Usefulness

Participants also answered a question about the usefulness of the system/interaction to mimic (or replace) the reality considering the potential use of the virtual grip-lift interaction in training and simulation contexts (see Q4 in Table 2).

4.5.2. Objective Measure

To objectively examine the realism of the grip-lift interaction in VR, we measured participants’ hand poses while comparing them with the actual hand poses with the real object.

Hand Pose Similarity

The similarity of the hand poses in VR to the poses in the real world during the grip-lift interaction was evaluated quantitatively to compare the similarities between the experimental conditions, FK and LK. In other words, the similarity between the hand poses in the FK condition and the poses in the baseline real setting was compared with the similarity between the hand poses in the LK condition and the real setting. To obtain the hand pose from a third-person perspective, OpenPose [43] was used to extract 21 hand keypoints from a fixed camera position in the study environment; for this, one Logitech HD 1080p camera was placed facing the participant to capture the hand pose while avoiding occlusions (see Figure 5). OpenPose has been widely used to compare such similarities of hand poses or gestures [44,45].
The participant was asked to pause for about 2 s before and after touching the real or virtual object, and the similarity of the hand pose was calculated using the frames in between. The dynamic time warping (DTW) method, which is commonly used to recognize human poses and evaluate their similarity [44,45], was used to calculate the similarity scores in our experiment by comparing those frames. The 21 detected keypoints were converted to vectors, and the DTW distance was calculated from these keypoint coordinate sequences. The third-person camera and the DTW-based similarity score mitigate issues that would arise with a plain Euclidean distance, e.g., when the motion speed differs between videos or when the videos contain different numbers of frames. The similarity score ranges from 0% (not similar at all) to 100% (exactly the same).
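The pose-similarity computation can be sketched roughly as follows: each frame is the flattened set of 21 OpenPose hand keypoints, DTW aligns the VR and real-world frame sequences, and the DTW distance is mapped to a 0–100% score. The exact distance-to-percentage mapping used in the study is not reported, so the normalization below is only an assumption.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic dynamic-time-warping distance between two sequences of keypoint vectors."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def similarity_percent(seq_vr, seq_real, scale=1.0):
    """Maps the DTW distance to a 0-100% similarity score (assumed mapping)."""
    d = dtw_distance(seq_vr, seq_real) / max(len(seq_vr), len(seq_real))
    return 100.0 * np.exp(-d / scale)

# seq_vr, seq_real: lists of flattened (21 * 2,) keypoint-coordinate arrays, one per video frame.
```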

5. Results

In this section, we report the results of the realism measures, i.e., the subjective and objective similarities of the grip-lift experience in the FK or LK condition to the real grip-lift with the real object.

5.1. Subjective Measures

Due to the ordinal data type of the questionnaire responses and sample size, we used nonparametric Wilcoxon signed-rank tests to compare the perceived realism between the FK condition and the LK condition, with a significance level at α = 0.05.
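For reference, the test and the reported effect size r = Z/sqrt(N) can be reproduced with a sketch like the one below. The normal approximation of Z (without tie correction) and the choice of N are our assumptions, since the paper only reports the resulting statistics.

```python
import numpy as np
from scipy.stats import wilcoxon

def wilcoxon_with_effect_size(fk_scores, lk_scores):
    """Paired Wilcoxon signed-rank test on FK vs. LK scores, plus an approximate Z and r."""
    fk, lk = np.asarray(fk_scores, float), np.asarray(lk_scores, float)
    stat, p = wilcoxon(fk, lk)                   # rank-sum statistic W and two-sided p-value
    n = int(np.sum(fk != lk))                    # pairs with a non-zero difference
    mean_w = n * (n + 1) / 4.0
    sd_w = np.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (stat - mean_w) / sd_w                   # normal approximation (ignores tie correction)
    r = abs(z) / np.sqrt(n)                      # effect size per Rosenthal [46]; N convention assumed
    return stat, p, z, r
```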

5.1.1. Perceived Realism

In general, the reported scores in the FK condition for the three questions (Q1–Q3 in Table 2) related to the perceived realism were higher than the scores in the LK condition with statistical significance. As a high-level result, we combined all the responses for Q1–Q3 using a mean value as a representative realism perception score (Cronbach's alpha: 0.843) and compared the scores between the LK and FK conditions.
The result showed that the realism score in FK (Mdn = 3.92) was significantly higher than the score in LK (Mdn = 3.11); Z = 8.549, p < 0.001, r = 0.52 (a large effect based on Cohen's classification in Wilcoxon signed-rank tests [46]) (see Figure 6). This means that our proposed method with three kinematic characteristics of the hand could provide a more realistic grip-lift experience compared to the method that only considered the static finger force.
To understand the detailed effects, we further analyzed the reported scores among the different interaction scenarios per individual question. The details of the results are shown in Figure 6 and reported below.
For Q1, related to the perceived realism of the lifting result, we found statistically significant differences in all three interaction scenarios.
  • Finger force (A and B in Table 1): FK > LK; Z = 2.153, p = 0.031, r = 0.39 (effect size: moderate to large).
  • Finger flexion (C and D in Table 1): FK > LK; Z = 3.623, p < 0.001, r = 0.66 (effect size: very large).
  • Instantaneous speed (E and F in Table 1): FK > LK; Z = 2.013, p = 0.044, r = 0.36 (effect size: moderate to large).
On average for all three interaction scenario categories, the score in the FK condition (Mdn = 3.77) was higher than the score in the LK condition (Mdn = 2.91), with statistical significance; Z = 4.619, p < 0.001, r = 0.49 (effect size: moderate to large) (see Figure 6—Q1). This indicates that participants felt the result of the grip-lift interaction in the FK condition was more similar to the real grip-lift interaction with the real object compared to the LK condition.
Similarly, for Q2 related to the perceived realism of the engaged hand force, we also found statistically significant differences in all three interaction scenarios.
  • Finger force (A and B in Table 1): FK > LK; Z = 3.112, p = 0.002, r = 0.57 (effect size: large).
  • Finger flexion (C and D in Table 1): FK > LK; Z = 3.186, p = 0.001, r = 0.58 (effect size: large).
  • Instantaneous speed (E and F in Table 1): FK > LK; Z = 3.030, p = 0.002, r = 0.55 (effect size: large).
Again, for the average of all three interaction scenarios, the score in the FK condition (Mdn = 3.92) was higher than the score in the LK condition (Mdn = 3.02) with statistical significance; Z = 5.365, p < 0.001, r = 0.56 (effect size: large) (see Figure 6—Q2). This indicates that participants felt the level of the engaged force in the FK condition was more similar to the force in the real grip-lift interaction with the real object compared to the LK condition.
Finally, for Q3, related to the perceived realism of the hand pose, there were statistically significant differences in all three interaction scenarios.
  • Finger force (A and B in Table 1): FK > LK; Z = 3.123, p = 0.002, r = 0.57 (effect size: large).
  • Finger flexion (C and D in Table 1): FK > LK; Z = 3.207, p = 0.001, r = 0.59 (effect size: large).
  • Instantaneous speed (E and F in Table 1): FK > LK; Z = 2.592, p = 0.010, r = 0.47 (effect size: moderate to large).
Again, for the average of all three interaction scenarios, the score in the FK condition (Mdn = 4.08) was higher than the score in the LK condition (Mdn = 3.39) with statistical significance; Z = 5.004, p < 0.001, r = 0.53 (effect size: large) (see Figure 6—Q3). This means that participants felt their hand poses in the FK condition were more similar to the poses in the real grip-lift interaction with the real object compared to the LK condition.

5.1.2. Usefulness

The results of Q4 (Usefulness) showed that the score in FK (Mdn = 3.77), which considered three kinematic characteristics of the hand, was higher than the score in LK (Mdn = 3.00); Z = 4.738, p < 0.001, r = 0.50 (effect size: large) (see Figure 7). Regarding the individual interaction scenarios, we found statistically significant differences in all three.
  • Finger force (A and B in Table 1): FK > LK; Z = 2.567, p = 0.010, r = 0.47 (effect size: moderate to large).
  • Finger flexion (C and D in Table 1): FK > LK; Z = 3.145, p = 0.002, r = 0.57 (effect size: large).
  • Instantaneous speed (E and F in Table 1): FK > LK; Z = 2.495, p = 0.013, r = 0.46 (effect size: moderate to large).
This indicates that our kinematics-based method was perceived to be more useful for simulating real grip-lift interactions compared to the method that only considered static finger force.

5.2. Objective Measure

As an objective measure to examine the realism of the grip-lift interaction using our proposed method, we analyzed participants’ hand poses when they performed the grip-lift interaction in VR, comparing them with the poses with the real object.

Hand Pose Similarity

We compared each participant's hand pose similarity to the baseline real setting between the FK condition and the LK condition. Figure 8 shows the details of the similarity scores in all three interaction scenarios described in Table 1. In the interaction scenarios related to finger force, more participants showed hand poses similar to the real grip-lift situation in the FK condition (A: 11 participants, B: 10 participants) than in the LK condition (A: 4, B: 5) (see Figure 8a). However, the number of participants who showed hand poses similar to the real situation was almost the same between the FK condition (C: 7, D: 7) and the LK condition (C: 8, D: 8) in the scenarios related to finger flexion (see Figure 8b). In the scenarios related to instantaneous speed, more participants again showed hand poses similar to the real situation in the FK condition (E: 12, F: 10) than in the LK condition (E: 3, F: 5) (see Figure 8c). In general, more participants made hand poses similar to the real situation in the FK condition than in the LK condition; however, the effect did not appear to be strong.

6. Discussion

Through the analysis of the subjective measures, our results show that the participants felt that the grip-lift interaction using our method (FK), which adopts three kinematic characteristics of the hand, was more similar to the real grip-lift interaction (i.e., more realistic) than the method that adopts the static finger force only (LK). Participants particularly mentioned that they occasionally experienced more difficulty lifting the virtual object (or even failed to lift it) in the FK condition. This difficulty (or failure) of grip-lifting could actually make the experience feel more realistic, because lifting such an object would/should not be easy in reality when grabbing it loosely. These results support our hypothesis (H1) about the higher realism of our method.
The results also show that participants perceived the proposed method would be more useful in realistic simulation and training, supporting our hypothesis (H2).
Most simulation and training experiences in immersive virtual environments, e.g., virtual training for construction workers, involve or aim to have dynamic hand interactions with virtual objects, e.g., selecting, grabbing, lifting, or moving. Realistic/natural hand interaction is one of the key factors to render realistic, effective simulation and training in VR, without breaking the sense of presence [47]. Our kinematics-based hand interaction in the grip-lift context would be an effective solution to create such realistic training environments.
Regarding our hypothesis (H3) on objective hand pose similarity, the number of participants who displayed more realistic hand poses, i.e., poses more similar to the grip-lift hand poses with the real object, was higher in the FK condition than in the LK condition for certain scenarios, e.g., those related to finger force and instantaneous speed. However, the differences are not large; thus, we do not draw any strong conclusions for this hypothesis at this point. This might be because the participants mostly relied on visual cues, e.g., the virtual object's size and shape, which were provided equally in the FK and LK conditions, when grabbing the object.
Interestingly, given the results of the subjective measure in Q3 regarding the perceived pose realism, the participants thought their hand poses in the FK condition were more realistic than in the LK condition, although we did not necessarily find such realism in the objective hand pose analysis. This is particularly interesting because it shows that the user's perception can be positively influenced by our method even without changing their actual motions. The absence of the physical lifting object in both FK and LK probably mitigated the effects of our method on realistic hand poses.
Our experiment showed positive effects of the proposed method on perceived realism and usefulness, but we should also acknowledge some potential limitations, which lead us to future research directions. Although the proposed model is generalizable to different participant profiles, e.g., males, females, adults, and children with different hand force levels, we fixed certain values in the model for the simplicity of our experiment. Further research with various participants and with virtual objects of different shapes and sizes should be conducted to consolidate our findings, which we are currently planning with a larger sample size.
Some might observe that the proposed method is too simple to mimic the diversity of real hand poses and forces. However, we would like to highlight that it is our intention to generate a simple yet effective model, built around the kinematic parameters, and to demonstrate its usefulness in practice. Future research on more complex force models and sophisticated optimization remains open, including the consideration of multimodal inputs beyond visual cues, such as electromyographic (EMG) signals.

7. Conclusions

In this paper, we proposed a novel kinematics-based grip-lift hand interaction method and developed a prototype using it. The hand force model in our method considers finger forces at contact points, finger flexion, and instantaneous speed to realize a physically plausible, realistic grip-lift experience in VR. Our experiment showed that the kinematics-based method could improve the sense of realism during grip-lift interactions. The findings will be beneficial for designing and developing more realistic virtual experiences that require natural and realistic hand interactions. We discussed possible limitations of our study, such as the limited sample and the simple hand force model, but also highlighted its practical strengths. To address these limitations, we are planning follow-up studies with more participants and different object and interaction settings, and further research on modeling and developing realistic hand interactions in VR is encouraged.

Author Contributions

Conceptualization, H.N. and J.-I.P.; methodology, H.N. and J.-I.P.; software, H.N. and C.K.; validation, K.K. and J.-I.P.; formal analysis, H.N., C.K. and K.K.; investigation, H.N. and C.K.; resources, J.-I.P.; data curation, H.N. and C.K.; writing—original draft preparation, H.N., C.K. and K.K.; writing—review and editing, H.N., K.K. and J.-I.P.; visualization, H.N.; supervision, K.K. and J.-I.P.; project administration, J.-I.P.; funding acquisition, K.K. and J.-I.P. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT; Ministry of Science and ICT) (Nos. 2020R1G1A1102709, RS-2023-00222776) and Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2020-0-00452, Development of Adaptive Viewer-centric Point cloud AR/VR (AVPA) Streaming Platform). We also acknowledge the support of the Natural Sciences and Engineering Research Council of Canada (NSERC), [RGPIN-2022-03294].

Institutional Review Board Statement

Ethical review and approval were not required for the study on human participants in accordance with local legislation and institutional requirements.

Informed Consent Statement

Informed consent was verbally obtained from all subjects involved in the study.

Data Availability Statement

The data are not publicly available due to privacy or ethical restrictions.

Acknowledgments

The authors would also like to acknowledge members of the Mixed Reality Lab at Hanyang University.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Karam, M. A Framework for Research and Design of Gesture-Based Human-Computer Interactions. Ph.D. Thesis, University of Southampton, Southampton, UK, 2006.
2. Höll, M.; Oberweger, M.; Arth, C.; Lepetit, V. Efficient physics-based implementation for realistic hand-object interaction in virtual reality. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 175–182.
3. Kang, H.J.; Shin, J.h.; Ponto, K. A comparative analysis of 3d user interaction: How to move virtual objects in mixed reality. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 275–284.
4. Chessa, M.; Maiello, G.; Klein, L.K.; Paulun, V.C.; Solari, F. Grasping objects in immersive Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1749–1754.
5. Shi, R.; Zhang, J.; Yue, Y.; Yu, L.; Liang, H.N. Exploration of Bare-Hand Mid-Air Pointing Selection Techniques for Dense Virtual Reality Environments. In Proceedings of the Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–29 April 2023.
6. Juan, M.; Elexpuru, J.; Dias, P.; Santos, B.S.; Amorim, P. Immersive virtual reality for upper limb rehabilitation: Comparing hand and controller interaction. Virtual Real. 2023, 27, 1157–1171.
7. Dube, T.J.; Arif, A.S. Ultrasonic Keyboard: A Mid-Air Virtual Qwerty with Ultrasonic Feedback for Virtual Reality. In Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction, Warsaw, Poland, 26 February–1 March 2023.
8. Wang, Y.; Hu, Z.; Yao, S.; Liu, H. Using visual feedback to improve hand movement accuracy in confined-occluded spaces in virtual reality. Vis. Comput. 2023, 39, 1485–1501.
9. Delrieu, T.; Weistroffer, V.; Gazeau, J.P. Precise and realistic grasping and manipulation in Virtual Reality without force feedback. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 266–274.
10. Jiang, H.; Liu, S.; Wang, J.; Wang, X. Hand-Object Contact Consistency Reasoning for Human Grasps Generation. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, BC, Canada, 11–17 October 2021; pp. 11107–11116.
11. Buchmann, V.; Violich, S.; Billinghurst, M.; Cockburn, A. FingARtips: Gesture based direct manipulation in Augmented Reality. In Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, Singapore, 15–18 June 2004; pp. 212–221.
12. Moehring, M.; Froehlich, B. Effective manipulation of virtual objects within arm's reach. In Proceedings of the 2011 IEEE Virtual Reality Conference, Singapore, 11–13 March 2011; pp. 131–138.
13. Nasim, K.; Kim, Y.J. Physics-based interactive virtual grasping. In Proceedings of the HCI Korea, Kangwon-do, Republic of Korea, 27–29 January 2016; pp. 114–120.
14. Kargov, A.; Pylatiuk, C.; Martin, J.; Schulz, S.; Döderlein, L. A comparison of the grip force distribution in natural hands and in prosthetic hands. Disabil. Rehabil. 2004, 26, 705–711.
15. Al-Fahaam, H.; Davis, S.; Nefti-Meziani, S.; Theodoridis, T. Novel soft bending actuator-based power augmentation hand exoskeleton controlled by human intention. Intell. Serv. Robot. 2018, 11, 247–268.
16. Wu, L.; Jung de Andrade, M.; Saharan, L.K.; Rome, R.S.; Baughman, R.H.; Tadesse, Y. Compact and low-cost humanoid hand powered by nylon artificial muscles. Bioinspir. Biomim. 2017, 12, 26004.
17. Zhou, G.; Lu, M.L.; Yu, D. Investigating gripping force during lifting tasks using a pressure sensing glove system. Appl. Ergon. 2023, 107, 103917.
18. Han, D.; Lee, R.; Kim, K.; Kang, H. VR-HandNet: A Visually and Physically Plausible Hand Manipulation System in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2023.
19. Bretz, K.J.; Jobbágy, Á.; Bretz, K. Force measurement of hand and fingers. Biomech. Hung. 2010, 3, 61–66.
20. Kinoshita, H.; Kawai, S.; Ikuta, K.; Teraoka, T. Individual finger forces acting on a grasped object during shaking actions. Ergonomics 1996, 39, 243–256.
21. Biju, S.M.; Motti, B.; Malek, M.F.; Oroumchian, F.; Bell, A. Design of hand grip system with focus on tripod grip strength. Int. J. Eng. Trends Technol. 2020, 68, 28–37.
22. Richards, L.G.; Olson, B.; Palmiter-Thomas, P. How forearm position affects grip strength. Am. J. Occup. Ther. 1996, 50, 133–138.
23. Jarque-Bou, N.J.; Vergara, M.; Sancho-Bru, J.L.; Gracia-Ibáñez, V.; Roda-Sales, A. Hand kinematics characterization while performing activities of daily living through kinematics reduction. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1556–1565.
24. Li, X.; Wen, R.; Duanmu, D.; Huang, W.; Wan, K.; Hu, Y. Finger Kinematics during Human Hand Grip and Release. Biomimetics 2023, 8, 244.
25. Skerik, S.K.; Weiss, M.W.; Flatt, A.E. Functional evaluation of congenital hand anomalies. Am. J. Occup. Ther. Off. Publ. Am. Occup. Ther. Assoc. 1971, 25, 98–104.
26. Haidacher, S. Contact Point and Object Position from Force/Torque and Position Sensors for Grasps with a Dextrous Robotic Hand. Ph.D. Thesis, Technische Universität München, Munich, Germany, 2004.
27. Feix, T.; Romero, J.; Schmiedmayer, H.B.; Dollar, A.M.; Kragic, D. The GRASP Taxonomy of Human Grasp Types. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 66–77.
28. Hu, D.; Ren, L.; Howard, D.; Zong, C. Biomechanical analysis of force distribution in human finger extensor mechanisms. BioMed Res. Int. 2014, 2014, 1–9.
29. Erjiage, G.; Nassour, J.; Cheng, G. Multi-sensory fusion of wearable sensors for automatic grasping and releasing with soft-hand exoskeleton. In Proceedings of the 2023 IEEE International Conference on Soft Robotics (RoboSoft), Singapore, 3–7 April 2023; pp. 1–6.
30. Lee, K.S.; Jung, M.C. Ergonomic evaluation of biomechanical hand function. Saf. Health Work. 2015, 6, 9–17.
31. Iyengar, V.; Santos, M.J.; Aruin, A.S. Role of movement velocity on the magnitude of grip force while lifting an object with touch from the contralateral finger. Motor Control 2009, 13, 130–141.
32. Tigue, J.A.; King, R.J.; Mascaro, S.A. Simultaneous kinematic and contact force modeling of a human finger tendon system using bond graphs and robotic validation. J. Dyn. Syst. Meas. Control 2020, 142, 031007.
33. Salisbury, J.K.; Roth, B. Kinematic and force analysis of articulated mechanical hands. J. Mech. Transm. Autom. 1983, 105, 35–41.
34. Carabello, A.; Henkner, R.; Drossel, W.G. Novel Procedure for Determining the Finger Force in Flexion Depending on the Finger Position. Curr. Dir. Biomed. Eng. 2022, 8, 384–387.
35. Butin, C.; Chablat, D.; Aoustin, Y.; Gouaillier, D. Novel Kinematics of an Anthropomorphic Prosthetic Hand Allowing Lateral and Opposite Grasp With a Single Actuator. J. Comput. Nonlinear Dyn. 2023, 18, 061005.
36. Park, J.; Xu, D. Multi-finger interaction and synergies in finger flexion and extension force production. Front. Hum. Neurosci. 2017, 11, 318.
37. Keenan, K.G.; Santos, V.J.; Venkadesan, M.; Valero-Cuevas, F.J. Maximal voluntary fingertip force production is not limited by movement speed in combined motion and force tasks. J. Neurosci. 2009, 29, 8784–8789.
38. Nicholas, J.W.; Corvese, R.J.; Woolley, C.; Armstrong, T.J. Quantification of hand grasp force using a pressure mapping system. Work 2012, 41, 605–612.
39. Kim, J.S.; Park, J.M. Physics-based hand interaction with virtual objects. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 3814–3819.
40. Chalfoun, J.; Renault, M.; Younes, R.; Ouezdou, F.B. Muscle forces prediction of the human hand and forearm system in highly realistic simulation. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004; Volume 2, pp. 1293–1298.
41. Zhang, M.; Mak, A. In vivo friction properties of human skin. Prosthet. Orthot. Int. 1999, 23, 135–141.
42. Wang, H.; Hwang, S.U.; Lee, Y.G. Improved finger bending angles measurements for accurate interactions with virtual objects. Korean J. Comput. Des. Eng. 2008, 13, 323–333. Available online: https://www.koreascience.or.kr/article/JAKO200811850422716.page (accessed on 6 April 2022).
43. Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.E.; Sheikh, Y. OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 43, 172–186.
44. Fragkiadakis, M.; Nyst, V.; van der Putten, P. Signing as input for a dictionary query: Matching signs based on joint positions of the dominant hand. In Proceedings of the sign-lang@LREC 2020, European Language Resources Association (ELRA), Marseille, France, 11–16 May 2020; pp. 69–74.
45. Osawa, R.; Ishikawa, T.; Watanabe, H. Pitching Motion Matching Based on Pose Similarity Using Dynamic Time Warping. In Proceedings of the 2020 IEEE 9th Global Conference on Consumer Electronics (GCCE), Kobe, Japan, 13–16 October 2020; pp. 1–5.
46. Rosenthal, R.; Cooper, H.; Hedges, L. Parametric measures of effect size. Handb. Res. Synth. 1994, 621, 231–244.
47. Slater, M. Presence and The Sixth Sense. Presence Teleoper. Virtual Env. 2002, 11, 435–439.
Figure 1. The proposed grip-lift interaction based on three hand kinematic characteristics: (1) force at finger contact points, (2) finger flexion, and (3) instantaneous speed of the hand/finger. Depending on how much hand force is involved in the grip-lift interaction, the user may succeed or fail to lift a virtual object in an immersive VR environment.
Figure 2. A virtual hand model that shows the finger parts (possible contact points) and the force weights. The yellow boxes are the offset areas including contact points. The boxes with a red circle are assigned a double weight—twice the force of the boxes without a red circle. The hand model is from Epic Games' Unreal Engine.
Figure 3. Examples of finger flexion values for three joints.
Figure 4. Grip-lift tasks used in the experiment: (a) baseline real experience: grip lifting a 1.8 kg real cylinder steel pole using the dominant hand in the real environment, (b) experimental virtual experience: grip lifting a corresponding virtual object in an immersive VR environment without the real object, and the participant's egocentric view in VR.
Figure 5. Setting for quantitative hand pose similarity evaluation: (a) view from a third-person perspective with an RGB camera, and (b) estimated pose with OpenPose [43].
Figure 6. Results of the perceived realism using averaged scores in Q1–Q3. Medians and standard deviations are used for the bar plots. Statistical significance: * (p < 0.05), ** (p < 0.01), and *** (p < 0.001).
Figure 7. Results of the usefulness (Q4). Medians and standard deviations are used for the bar plots. Statistical significance: * (p < 0.05), ** (p < 0.01), and *** (p < 0.001).
Figure 8. Results of objective hand pose similarity scores between each condition (LK, FK) and the baseline real setting (%). Blue boxes represent the cases when FK shows a higher hand pose similarity score than LK. The interaction scenarios related to (a) finger force, (b) finger flexion, and (c) instantaneous speed are described in Table 1.
Table 1. Grip-lift scenarios used in VR for our experiment. Six interaction scenarios were prepared to address different kinematic factors—two for each factor.

Grip-Lift Scenario | Task Situation | Label
Finger force | Use the thumb and index finger to grip and lift the object | A
Finger force | Use the thumb and little finger to grip and lift the object | B
Finger flexion | Bend the fingers and tightly grip the object to lift | C
Finger flexion | Bend the fingers and loosely grip the object to lift | D
Instantaneous speed | Exert force to quickly grip and lift up the object | E
Instantaneous speed | Exert force to slowly grip and lift up the object | F
Table 2. Questions to collect participants' subjective perception of the virtual grip-lift interaction.

Q1 (Result realism): The lifting result that I experienced in VR was similar to the result in the real situation with the real object.
Q2 (Force realism): The level of force applied to my hand to lift the virtual object was similar to the level in the real situation with the real object.
Q3 (Pose realism): My hand pose during the grip-lift interaction in VR was similar to the pose in the real situation with the real object.
Q4 (Usefulness): The grip-lift interaction that I experienced in VR would be useful to simulate realistic virtual training.