Article

Evaluating Virtual Hand Illusion through Realistic Appearance and Tactile Feedback

Department of Computer Graphics Technology, Purdue University, West Lafayette, IN 47907, USA
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2022, 6(9), 76; https://doi.org/10.3390/mti6090076
Submission received: 20 July 2022 / Revised: 29 August 2022 / Accepted: 2 September 2022 / Published: 6 September 2022

Abstract:
We conducted a virtual reality study to explore virtual hand illusion through three levels of appearance (Appearance dimension: realistic vs. pixelated vs. toon hand appearances) and two levels of tactile feedback (Tactile dimension: no tactile vs. tactile feedback). We instructed our participants to complete a virtual assembly task in this study. Immediately afterward, we asked them to provide self-reported ratings on a survey that captured presence and five embodiment dimensions (hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli). The results of our study indicate that (1) tactile feedback generated a stronger sense of presence, touch sensation, and response to external stimuli; (2) the pixelated hand appearance provided the least hand ownership and external appearance; and (3) in the presence of the pixelated hand, prior virtual reality experience of participants impacted their agency and motor control and their response to external stimuli ratings. This paper discusses our findings and provides design considerations for virtual reality applications with respect to the realistic appearance of virtual hands and tactile feedback.

1. Introduction

Hands are vital for daily life tasks, and the same applies to virtual reality (VR). Virtual hands are essential for both forceful and manipulation tasks [1]. In previous VR studies, the inclusion of virtual hands was proven to provide better task performance for object interaction [2]. Moreover, the existence of virtual hands improves visual cognition, due to increases in short-term memory [3], and enhances cognitive control [4]. In modern VR applications, the integration of virtual hands has become an essential design principle [5].
Nowadays, VR technologies allow users to experience a realistic, immersive environment while observing and feeling their virtual body or body parts. In addition, users can use their VR controllers to interact with virtual objects. Studies in the past demonstrated the importance of virtual hand appearance, indicating that with the existence of virtual hand models, the VR environment became more appealing to users [6,7,8]. Furthermore, a more realistic hand appearance style generates a stronger sense of presence for VR users [9,10] when compared to either mannequin hands or robot hands, or simple geometry blocks.
VR developers can also import somatosensory cues in VR by activating tactile feedback for the controllers. A previously conducted study showed that vibrotactile feedback enhances the sense of immersion and interactivity for VR users [11]. Moreover, recent studies have shown that the vibrotactile feedback of VR controllers could increase the perceived realism of the appearance of the virtual hands [12], as well as the sense of presence and involvement [13] of VR users.
“Virtual hand illusion” (VHI) describes the illusion of ownership of virtual hands [14]. It refers to the observation that VR users perceive “body ownership” if a virtual hand moves or senses feedback in a virtual environment. VHI studies have explored the effects of different hand model geometries (e.g., realistic hand shape, mannequin hand shape, low-poly blocks, and mini-sphere dots). They found that the actual hand shape generated more substantial ownership than the other geometries [9,15]. Further understanding of VHI would provide more detailed design principles for virtual hand models and methods to aid task performance [16,17,18].
As mentioned, studies in the past focused on different 3D models (3D geometries) of virtual hands [9,15], as well as on different sizes of virtual hands [19,20]. Specifically, previous studies [9,12,15] compared the differences between hand models of different appearances and geometries by applying large manipulations across the provided stimuli. Moreover, previous studies also applied similar shaders and found significant results in terms of human perception [21,22]. In fact, researchers found that less detailed shaders such as pixelation and toon were more likable to study participants [23], and adding detail to virtual agents decreased perceived eeriness [22]. In this study, we decided to use a minor manipulation (same hand geometry, different shaders) to study in more detail whether appearance differences alone can affect the virtual hand illusion. To the best of our knowledge, no study has been conducted to explore manipulation based only on hand appearance (render style). For this reason, we conducted a 3 (Appearance: realistic vs. pixelated vs. toon) × 2 (Tactile: tactile vs. no tactile feedback) study to further explore the realistic appearance of virtual hands (see Figure 1) and tactile feedback in virtual reality.
Note that shaders are widely used in games to provide not only detailed rendering (e.g., shadows, lighting, and landscape) [24], but also to generate more realistic visual effects (e.g., sprites, particles, and texture gradients) in virtual environments [25,26]. We think that understanding the impact of the realistic appearance of virtual hands will improve our understanding regarding VHI and thus provide more insights to researchers and VR developers. To do so, and based on our study design, we aim to answer the following research questions:
  •  RQ1: Do different hand appearances affect our participants’ sense of presence and VHI?
  •  RQ2: Does tactile feedback affect our participants’ sense of presence and VHI?
  •  RQ3: Does prior VR experience affect our participants’ sense of presence and VHI?
This paper’s structure is as follows: Section 2 presents work related to our study. Section 3 provides details of the methodology followed. Section 4 presents the results obtained. In Section 5, we discuss our findings, along with our limitations and design considerations. Finally, we discuss our conclusions and future work directions in Section 6.

2. Related Work

Embodiment, the sense that a virtual avatar seen from a first-person view substitutes for the user’s own body [27], has been studied widely in projects exploring self-motion artifacts [28], visuomotor calibration [29], tactile sensation [30], and external appearance [31]. Previously published research has revealed various factors that can impact embodiment, such as the location of the body, body ownership, agency and motor control, and external appearance [32]. Previous studies have indicated that a virtual avatar in a similar location to the participant [33], agency over the virtual body (such as the synchronous movement of body parts) [34], and both look-alike self-avatars and self-avatars of different genders [35] and racial groups [36] enhance the illusion of embodiment. In summary, participants with a collocated body will experience the embodiment illusion, and appearance and agency can enhance that illusion [32].
The virtual hand has become an essential topic in understanding embodiment since it is the most used and seen body part in a first-person view in a virtual environment. Users can perceive their virtual hands using either explicit (actively assessing body size and shape, e.g., “How long is my hand?”) [37] or implicit methods (finding landmarks on specific body parts, e.g., “Where is my wrist?”) [38]. In VR, these methods correspond to visual and haptic methods [39]. Studies in VR have found both ways to be effective, by providing a realistic hand appearance for VR hand models [9,15,40] and tactile feedback for participants to locate their hands and landmarks [12].
With the development of different input devices, such as finger-tracking gloves and motion controllers, VR hands can now achieve a high level of interaction [41]. In designing interaction techniques, previously conducted studies compared various input methods. Moehring and Froehlich [42] compared finger-based and controller-based control in both head-mounted display (HMD) and cave automatic virtual environment (CAVE) setups. They found that finger-based interaction was preferred, but controller-based interaction had better performance [42]. Lin and Schulze compared grasping gestures and controller manipulation for object interaction [43]. They found that participants favored gestures but felt that VR controllers were more reliable [43]. Note that finger- and gesture-based interaction introduced uncertainty due to limited tracking capability and reliability when compared with controller buttons [43,44].
One crucial factor commonly studied with virtual hands is tactile feedback, which we perceive through our skin. As mentioned before, the tactile sensation is an essential factor contributing to embodiment [30]. Therefore, understanding the perception of tactile feedback has become crucial for VR studies. A previously conducted study showed that different sensation areas have different sensitivity thresholds for tactile feedback frequencies [45]. Moreover, a previously published study indicated that the sense of tactile feedback on specific body parts, such as the elbow and wrist, could produce acute spatial correlation [46]. On that basis, Gaudeni et al. [47] successfully relocated sensed body parts using a haptic ring to deliver fingertip tactile information to a different hand location.
Modern approaches have found a significant effect of vibrotactile feedback on embodiment and VHI in VR studies for various tasks [48,49,50]. Tactile feedback activated on either the whole body or body parts improved users’ presence in immersive environments for rehabilitation [51], training [52], gaming [53], and other close-contact interactions with virtual characters [54,55,56]. The studies conducted by Guterstam et al. [57] and Fossataro et al. [58] indicated that the tactile feedback of a virtual hand increases an individual’s sense of ownership of hands and arms. Furthermore, Frohner et al. [48] concluded that vibrotactile feedback fostered hand embodiment. In a different approach, Moore et al. [49] explored the effect of tactile feedback patterns (natural, natural plus local vibratory, local vibratory, proximal vibratory, and no tactile feedback) on VHI. Although Moore et al. [49] did not find significant differences, all tactile feedback patterns generated VHI.
Besides VHI studies conducted in virtual environments, many studies have explored hand ownership in the real world. In rubber hand illusion (RHI) studies, having a synchronous stroke on both the real and virtual limb was proven effective in generating a sense of ownership over the fake limb [59,60]. In a later study, Tsakiris and Haggard [61] revisited the RHI study and found that the rubber hand illusion resulted from adaptation to the mislocalization of the simulated body part. They suggested that the bottom-up building process of visuotactile correlation was critical to the illusion [61]. Moreover, the appearance of the fake limb also impacted the RHI [62,63]. In a study with different shapes of limbs (four paper-cut cardboard limbs and a realistic rubber hand), the realistic 3D shape generated the strongest ownership. Therefore, the body model played an essential role in maintaining a coherent sense of one’s body [62].
Based on RHI and VHI studies, recent virtual hand illusion studies have focused on exploring elements that could affect the body ownership of virtual hands in virtual environments. Previous studies have found that multisensory integration could generate a sense of hand ownership [59]. In setups similar to the RHI studies, VR studies found significant effects of virtual hand appearance [9,14,15,19]. A realistic hand appearance generated the strongest VHI compared with other hand models [9,15]. Pyasik et al. [64] mentioned that a detailed hand appearance could be an essential factor contributing to VHI.
Another direction of VHI studies was the relationship between agency and motor control. Sanchez-Vives et al. [65] asserted that synchrony between visual and proprioceptive information induces an illusion of ownership. Brugada-Ramentol [66] verified this by finding that incongruent limb control decreases the sense of ownership. Lastly, other studies found that a high physical load alongside multisensory integration with free movements for virtual hands boosts implicit agency and improves VHI [67,68].
Although there are many studies on the effect of different hand models, little knowledge is available to understand the impact of a hand’s realistic appearance on VHI. A previous study showed that appearance fidelity was highly related to 3D rendering quality [69], which is considered an important design element, similar to audio and video quality [70,71]. Rendering style was the most popular technique to determine the rendering output of a 3D environment [24]. Many VR studies used different rendering styles to explore the impact of a realistic appearance [69,72,73]. In conclusion, a realistic appearance had a significant effect on the quality of experience and task completion time [69], change in emotional state [72], and perception of human expression [73]. It is necessary to explore the realistic appearances of virtual hands to understand the potential effects on VHI. In our study, we decided to explore the differences in the rendering styles of a realistic hand model. We decided to only use a single hand model (a realistic hand model) in our experiment across all conditions because we wanted only a single appearance manipulation to be present at a time. Thus, this study utilizes knowledge from previously published research and builds upon it. We conducted a 3 (Appearance) × 2 (Tactile) study to determine the key factors contributing to the sense of presence and the five abovementioned body ownership dimensions. Our results will contribute to a further understanding of the factors contributing to VHI.

3. Methodology

The subsections below present the methodology we followed in this study.

3.1. Participants

We conducted an a priori power analysis to determine the sample size of this study. For our calculations, we used 95% power, a medium-to-large effect size of f = 0.32, six (3 × 2) groups, and α = 0.05. The analysis resulted in a recommended sample size of 120 participants (20 participants per group).
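For reference, this kind of power calculation rests on the noncentral F distribution with noncentrality λ = f²·N. The snippet below is an illustrative SciPy sketch, not the software we used; the numerator degrees of freedom (here df1 = 1, a two-level effect such as the Tactile dimension) depend on which effect the analysis targets, so the result need not match a given tool’s output exactly.

```python
# Illustrative achieved-power check for N = 120, Cohen's f = 0.32, alpha = .05
# (a SciPy sketch, not the tool used in the study).
from scipy.stats import f as f_dist, ncf

def anova_power(n_total, effect_f, df1, k_groups, alpha=0.05):
    """Power of the fixed-effects ANOVA F test; noncentrality = f^2 * N."""
    df2 = n_total - k_groups                    # error degrees of freedom
    f_crit = f_dist.ppf(1 - alpha, df1, df2)    # rejection threshold
    lam = effect_f ** 2 * n_total               # noncentrality parameter
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

print(f"{anova_power(120, 0.32, df1=1, k_groups=6):.3f}")
```

With these inputs, the computed power lands near the targeted 0.95; increasing N raises it further.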
We recruited 120 participants (18–48 years old; M = 19.49, SD = 3.10) for our study through emails and class announcements made to undergraduate and graduate students in our department. Among our participants, 84 were male and 36 were female, equally distributed across participant groups. From the sample, 41 participants had no prior experience with VR, and 79 participants had previously experienced VR. All students volunteered for our study.

3.2. Virtual Reality Application

We developed our VR application for the study in Unreal Engine 4 and deployed it on an Oculus Quest 2 head-mounted display. The Oculus Quest 2 set had two controllers and an HMD with an internal camera sensor. The Quest 2 HMD was connected to a PC to run the application. We developed an instrumental task for our study because of the increasing number of VR applications for training and performing activities. Additionally, we expected our participants to use their hands actively in performing certain tasks rather than passively observing their bodies. In our application, we implemented two sections: the first was the training (tutorial) section and the second was the testing section (see Figure 2). We implemented the tutorial section to provide our participants with instructions and familiarize them with the use of virtual reality controllers; doing so has been shown to significantly improve virtual reality users’ performance [74]. Specifically, the tutorial section was reasonably straightforward, including only a gear part, a transparent gear slot, and a red button that started the testing section. The participants had the freedom to move around in the environment and pick up and assemble the gear onto the transparent slot. Participants only needed to use the grip button to “grab” the gear 3D model and “assemble” it by moving the gear toward the transparent slot. When the gear 3D model was close enough to the target slot, the gear would automatically “snap” onto the slot and stay in place. Participants could press the red button to proceed to the testing section whenever they were ready for the experiment.
The testing section included a long counter in the front, and a large white cube in the center with four transparent slots on the front side and four gears on the cube’s left side. We created six variations of the testing section, where each variation was assigned one of the hand appearances, with or without tactile feedback activated. We set the frequency and the intensity of the tactile feedback of the Oculus Touch controllers to maximum (frequency = 1.00 and intensity = 1.00). The grasping motion of the hand was the same for all conditions. The task for the participants was to assemble all four gears onto the cube. After that, a transparent sphere would appear in the center of the cube’s front side. Participants would put their right hand inside the 3D sphere model for two seconds. Then, a white spike would come out to “pierce” the virtual hand. Depending on the participant group, the tactile feedback of the controller was either activated or not. The spike was added as a metaphor to replicate the rubber hand illusion study. In the initial RHI study, a knife hit the fake limb at the end to explore the effect of synchronous strokes on hand ownership [59].
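The grab-and-snap behavior described above reduces to a simple distance test between the held gear and its target slot. The following is a minimal, engine-agnostic sketch; the threshold value and function names are our own illustration, not the study’s actual Unreal Engine 4 implementation.

```python
import math

SNAP_DISTANCE = 0.08  # metres; illustrative threshold, not the study's value

def try_snap(gear_pos, slot_pos, snap_distance=SNAP_DISTANCE):
    """Snap the held gear onto the transparent slot once it is close enough;
    otherwise leave it where it is. Returns (new_position, snapped)."""
    if math.dist(gear_pos, slot_pos) <= snap_distance:
        return slot_pos, True   # gear locks into place on the slot
    return gear_pos, False

# In the tactile conditions, a successful snap (or the spike "piercing"
# the hand) would additionally trigger controller vibration at maximum
# frequency and intensity (1.00 / 1.00), as described in the text.
pos, snapped = try_snap((0.0, 1.0, 0.5), (0.05, 1.0, 0.5))
```

The same distance test runs every frame while a gear is held, so the snap appears instantaneous to the participant.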

3.3. Experimental Conditions

By following a 3 (Appearance dimension: realistic vs. pixelated vs. toon hand appearances) × 2 (Tactile dimension: no tactile vs. tactile) between-group study design, we developed six experimental conditions: realistic tactile, pixelated tactile, toon tactile, realistic no tactile, pixelated no tactile, and toon no tactile. We used a disembodied body (“floating” hand) because we considered previous studies [12,48,75] and commercial applications, such as BOXVR and Job Simulator, that also used “floating” hands. Each group consisted of 20 participants (14 male and 6 female). Note that, in the conditions in which we activated the tactile feedback, the controller would vibrate when grabbing the objects and getting “pierced” by the spike.

3.4. Measurements

In our study, we collected self-reported data using the Slater–Usoh–Steed (SUS) scale [76] on presence, as well as through the avatar embodiment scales (virtual hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli) by Gonzalez-Franco and Peck [32]. We used a 7-point Likert scale for all questions/statements (see Table 1). We disseminated the scales to our participants using the Qualtrics online survey tool offered by our university. Note that we made the necessary adjustments to the questions/statements to fit the scope of the study.

3.5. Procedure

Upon arrival at our research lab, where the study took place, we asked the participants to read the consent form and sign it upon their agreement. The Institutional Review Board (IRB) of the university approved our study. After agreeing to participate in our research and signing the consent form, we asked the participants to provide demographic data in Qualtrics. Then, we asked them to put on the Oculus Quest 2 HMD and adjust it to their heads accordingly. After the adjustments and after indicating that they were ready for the study, the experimenter started the application from the tutorial section. After becoming familiar with the operations, the participants pressed the red button on the table to continue to the testing section, where they experienced one of the experimental conditions. As mentioned in the testing section, we asked the participants to finish the four-gear assembly first and then to experience the “piercing” of the spike. We used the four-gear assembly task to induce the participants into the virtual environment and to start thinking about their virtual hand as their hand. Then, the spike “pierce” was used as a metaphor for the RHI experiment. After the participants pressed the red button again, they would end the application and take off the headset. We then instructed the participants to fill out the post-survey. After that, the researcher answered the participants’ questions, and, finally, the participants were thanked and dismissed. Each participant spent less than 30 min in our study. Figure 3 illustrates the experimental setting of our research and a participant using our application.
Considering the current COVID-19 pandemic and following our institution’s instructions, we allowed a 30-minute gap between participants to minimize infection risk. Participants were either vaccinated or followed the instructions of our university regarding regular testing (at least once per week) for COVID-19. Once the participants arrived at the lab, we asked them to sanitize their hands, and during the study, all participants wore face masks and VR sanitary masks. Additionally, the research team carefully sanitized all equipment and furniture before the participants’ arrival and after each study session.

4. Results

In this section, we report all the results of this study. We performed all analyses using the IBM SPSS v.26.0 statistical analysis software. The normality assumption for all self-reported ratings was screened with Shapiro–Wilk tests at the 5% level and graphically using Q–Q plots of the residuals. We tested the measured items of the scales for reliability (Cronbach’s alpha: 0.72 ≤ α ≤ 0.90), and due to sufficient correlation, we used the cumulative score of all items for each scale as the final result and treated them as continuous scales [77,78], which is a typical practice. No removal of items would have enhanced the internal reliability of the scales. For all statistical tests, p < 0.05 was deemed statistically significant. In addition to the results presented below, we note that we also screened our data for gender and age differences; we did not find significant results and therefore do not report such analyses below. Figure 4 illustrates the boxplots of our results.
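The reliability screen above uses Cronbach’s alpha, which can be computed directly from item scores. Below is a minimal standard-library sketch of the textbook formula (not the SPSS routine we ran), where each inner list holds one item’s scores across respondents:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for item-score columns (one list per item, one entry
    per respondent): alpha = k/(k-1) * (1 - sum(item variances) / var(totals))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent sums
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three identical item columns are perfectly consistent (alpha = 1);
# weakly related columns yield a lower value.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```

With adequate alpha, summing the items into one cumulative score per scale, as described above, is the conventional next step.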

4.1. Presence

In terms of presence, according to the two-way analysis of variance (ANOVA), we did not find an (Appearance × Tactile) interaction effect (F(2, 118) = 0.245, p = 0.783, ηp² = 0.004). However, we found a significant main effect for the Tactile dimension (F(1, 119) = 9.532, p = 0.003, ηp² = 0.077). Finally, we did not find a statistically significant main effect for the Appearance dimension (F(1, 119) = 0.166, p = 0.847, ηp² = 0.003).
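The variance decomposition behind these two-way ANOVAs can be sketched for a balanced design as follows. The data and cell labels are illustrative (not our participants’ ratings), and p-values are omitted because the F distribution’s CDF is outside Python’s standard library.

```python
from statistics import mean

def two_way_anova(cells):
    """F ratios for factor A, factor B, and the A x B interaction in a
    balanced design. cells: {(a_level, b_level): [scores]}, equal cell sizes."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))                      # scores per cell
    grand = mean(x for v in cells.values() for x in v)
    am = {a: mean(x for b in b_levels for x in cells[(a, b)]) for a in a_levels}
    bm = {b: mean(x for a in a_levels for x in cells[(a, b)]) for b in b_levels}
    cm = {k: mean(v) for k, v in cells.items()}
    ss_a = n * len(b_levels) * sum((am[a] - grand) ** 2 for a in a_levels)
    ss_b = n * len(a_levels) * sum((bm[b] - grand) ** 2 for b in b_levels)
    ss_ab = n * sum((cm[(a, b)] - am[a] - bm[b] + grand) ** 2
                    for a in a_levels for b in b_levels)
    ss_err = sum((x - cm[k]) ** 2 for k, v in cells.items() for x in v)
    df_a, df_b = len(a_levels) - 1, len(b_levels) - 1
    ms_err = ss_err / (len(a_levels) * len(b_levels) * (n - 1))
    return (ss_a / df_a / ms_err,        # F for factor A (Appearance)
            ss_b / df_b / ms_err,        # F for factor B (Tactile)
            ss_ab / (df_a * df_b) / ms_err)

# Hypothetical ratings with a clear Tactile effect and no Appearance effect.
data = {(a, t): ([5, 6, 5, 6] if t == "tactile" else [3, 4, 3, 4])
        for a in ("realistic", "pixelated", "toon")
        for t in ("tactile", "none")}
f_appearance, f_tactile, f_interaction = two_way_anova(data)
```

In this toy data, only the Tactile F ratio is large; the Appearance and interaction terms are zero, mirroring how a main effect without an interaction looks in the tables above.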

4.2. Hand Ownership

The two-way ANOVA did not reveal an (Appearance × Tactile) interaction effect (F(2, 118) = 0.045, p = 0.956, ηp² = 0.001) on hand ownership. However, we found a significant main effect for Appearance (F(1, 119) = 11.226, p < 0.001, ηp² = 0.165). Post hoc comparisons using Bonferroni-corrected estimates showed that participants with the pixelated hand appearance (M = 4.18) rated hand ownership significantly lower than those in the toon (M = 4.95) and realistic (M = 5.35) hand appearance conditions. Finally, we did not find a significant main effect for the Tactile dimension (F(1, 119) = 0.933, p = 0.336, ηp² = 0.008).

4.3. Touch Sensation

The two-way ANOVA showed no (Appearance × Tactile) interaction effect (F(2, 118) = 0.045, p = 0.956, ηp² = 0.001) on touch sensation. However, we found a significant main effect for the Tactile dimension (F(1, 119) = 4.924, p = 0.028, ηp² = 0.041). Post hoc comparison showed that participants exposed to the no-tactile level (M = 3.84) rated their touch sensation significantly lower than those exposed to the tactile level (M = 4.25). Finally, we did not find a statistically significant main effect for the Appearance dimension (F(1, 119) = 2.380, p = 0.097, ηp² = 0.040).

4.4. Agency and Motor Control

In terms of agency and motor control, the two-way ANOVA did not reveal an (Appearance × Tactile) interaction effect (F(1, 119) = 0.094, p = 0.910, ηp² = 0.002). We also did not find a statistically significant main effect for either the Appearance dimension (F(1, 119) = 1.597, p = 0.207, ηp² = 0.027) or the Tactile dimension (F(1, 119) = 0.132, p = 0.717, ηp² = 0.001).

4.5. External Appearance

In terms of external appearance, the analysis did not reveal an (Appearance × Tactile) interaction effect (F(2, 118) = 0.005, p = 0.995, ηp² = 0.000). However, we found a significant main effect for the Appearance dimension (F(1, 119) = 6.402, p = 0.002, ηp² = 0.101). Participants assigned to the realistic hand appearance (M = 4.57) rated the external appearance of the virtual hand model higher than those exposed to the pixelated hand appearance (M = 3.47). Finally, we did not find a statistically significant main effect for the Tactile dimension (F(1, 119) = 1.886, p = 0.172, ηp² = 0.016).

4.6. Response to External Stimuli

No (Appearance × Tactile) interaction effect was found (F(2, 118) = 1.073, p = 0.346, ηp² = 0.018) on the response to external stimuli. However, we found a significant main effect for the Tactile dimension (F(1, 119) = 9.740, p = 0.002, ηp² = 0.079). Post hoc comparison showed that participants exposed to the no-tactile level (M = 3.34) rated their response to external stimuli significantly lower than those exposed to the tactile level (M = 4.04). Finally, the analysis did not provide a statistically significant main effect for the Appearance dimension (F(1, 119) = 1.997, p = 0.140, ηp² = 0.034).

4.7. Prior VR Experience

We conducted a three-way ANOVA (Appearance × Tactile × Prior VR Experience) to explore whether prior VR experience could potentially impact participant ratings (see Figure 5 for the interaction plots). For agency and motor control, our analysis revealed neither a three-way (Appearance × Tactile × Prior VR Experience) interaction effect (F(2, 118) = 0.393, p = 0.676, ηp² = 0.007) nor a two-way (Tactile × Prior VR Experience) interaction effect (F(1, 119) = 0.103, p = 0.749, ηp² = 0.001). However, it revealed a two-way (Appearance × Prior VR Experience) interaction effect (F(1, 119) = 3.594, p = 0.031, ηp² = 0.062). In the presence of the pixelated hand appearance, we found that participants with no prior VR experience provided higher ratings on agency and motor control than those with prior VR experience. We also found a significant main effect for Prior VR Experience (F(1, 119) = 6.831, p = 0.010, ηp² = 0.059). Pairwise comparison showed that participants with no prior VR experience provided lower ratings for agency and motor control than those with prior VR experience.
We also explored the response to external stimuli variable. According to our results, the ANOVA did not reveal a three-way (Appearance × Tactile × Prior VR Experience) interaction effect (F(2, 118) = 0.208, p = 0.813, ηp² = 0.004). Moreover, we did not find a two-way (Tactile × Prior VR Experience) interaction effect (F(1, 119) = 2.544, p = 0.114, ηp² = 0.023). However, we did find a statistically significant two-way (Appearance × Prior VR Experience) interaction effect (F(2, 118) = 3.559, p = 0.032, ηp² = 0.062). Specifically, we found that in the presence of the pixelated hand appearance, participants with no prior VR experience provided lower ratings of their responses to external stimuli compared with those with prior VR experience. Finally, we did not find a main effect of Prior VR Experience on the response to external stimuli (F(1, 119) = 0.436, p = 0.510, ηp² = 0.004).

5. Discussion

In this study, we aimed to explore how the appearance of virtual hands (Appearance dimension) and tactile feedback (Tactile dimension) impacted our participants in terms of presence and body ownership dimensions (hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli). In addition, we reported significant findings regarding the effect of participants’ prior VR experience. The subsections below discuss our findings and our study’s limitations and design considerations.

5.1. RQ1: Realistic Appearance

To answer RQ1, according to our results, we first found significant Appearance effects on hand ownership. Participants who had the pixelated hand reported significantly lower hand ownership than those who had experienced the toon and realistic hands. Previous studies have already proved the effect of avatar realism in immersive environments, indicating that a realistic avatar tends to generate stronger virtual body ownership [62,79,80]. Moreover, studies regarding virtual hands specifically mentioned that a more realistic/human-like hand style would provide a stronger hand illusion and perceived limb ownership when compared with hand models of different geometries and abstractions [9,15,19]. In addition, Pyasik et al. [64] found that detailed hand appearance could be a contributing factor for VHI, and later studies [64,81] confirmed this conclusion. In our research, the pixelated hand appearance obscured the details of the virtual hand. Our study results extend the previously reported findings by indicating that even appearance changes alone (we kept the geometry of the virtual hand as is) can impact the hand ownership illusion of participants. We think that the pixelated hand clashed stylistically with the virtual environment, which led participants to rate hand ownership lower than those exposed to the toon or realistic hand appearance. Therefore, we think a more detailed appearance (realistic and toon hand appearances) was more likely to generate a stronger sense of hand ownership than a less detailed one (pixelated hand appearance). Furthermore, no participants provided negative comments about using a hand model that resembles a real hand, which is a similar finding to previously published studies [9,15].
Our statistical analysis also revealed statistically significant results of Appearance on external appearance. Participants exposed to the pixelated hand rated the appearance lower than those exposed to the realistic hand appearance. Previously conducted studies revealed that avatar body style was related to appearance realism [40,82], as well as that the realistic hand model had a stronger effect on the ratings of external appearance compared to other less realistic hands [12]. In addition, based on the comments from the participants, the pixelated hand was referred to as “blurry,” “mosaic,” and “not clear enough to recognize.” In the previous studies, researchers mentioned the visual perception of virtual hands in implicit and explicit methods [37,38,83]. Referring to the comments of our participants (e.g., “I felt my hand was a bit blurry,” or “I could not connect my pixelated hand to the real hand”), we think the pixelated hand appearance did not help participants perceive the appearance of the virtual hands in both explicit (size and shape) [37,83] and implicit (location of landmarks—in this case, finger joints and wrist) [38] directions—the pixelated hand appearance made the virtual hand unclear and provided less information. The aforementioned is especially true when considering that the realistic hand model contributed a distinct visual appearance.

5.2. RQ2: Tactile Feedback

There were several statistically significant results in terms of tactile feedback (RQ2). We first found a significant effect on presence. Participants exposed to the no-tactile-feedback conditions rated their presence lower than those exposed to tactile feedback. Previously conducted studies have found that tactile feedback on different avatar body parts can generate a stronger sense of presence [30,84] and that participants without tactile feedback report lower presence levels than those with it [53]. Khamis et al. [13] also explored the effect of tactile feedback on presence and showed that activating tactile feedback resulted in a higher reported sense of presence compared with no-tactile-feedback conditions. In our experiment, tactile feedback was activated when participants touched the gears and when the spike pierced the virtual hand. We think the feedback made participants feel that their hands existed in the virtual world alongside the other objects. In contrast, participants who were not exposed to tactile feedback reported a disconnect between their presence in the virtual world and object interaction, a result that confirms previously published works [13,30,53,84].
Participants who were exposed to tactile feedback reported stronger tactile sensations when interacting with virtual objects than those who were not. Fossataro et al. [58] found that tactile feedback could reduce delusional body ownership in brain-damaged patients, suggesting that tactile stimulation can restore and enhance the touch sensation of hands. Other studies showed that tactile feedback provides a stronger sense of touch to VR users [11,85]. In this experiment, we instructed the participants to grasp and manipulate gears. The tactile feedback was activated when grabbing a gear, providing participants with a sense of touch location and object interaction.
Finally, we found that tactile feedback significantly affected the response to external stimuli. For our experiment, we borrowed and adapted the RHI paradigm: we used a spike to pierce the participants' right hand. Participants exposed to tactile feedback rated their response to external stimuli higher than those who were not. In the original rubber hand illusion study, Botvinick and Cohen [59] found that synchronous stroking generated the sense of a “phantom limb” and noted the participants' reactions after the “stab.” In a more advanced setup, D'Alonzo et al. [86] found that vibrotactile feedback impacted the phantom limb sensation on a prosthetic limb while also causing a stronger reaction to stimulation. Vargas et al. [83] noted that vibrotactile feedback could increase the proprioceptive recognition of a fake limb, which may explain a more pronounced reaction to external stimuli. In our experiment, the assembly process made participants feel that the virtual hand had become their own, since they controlled it. Therefore, the activated tactile feedback caused participants to react more realistically when the spike came out and pierced their virtual hands. In contrast, participants without tactile feedback did not react to the spike, even though they saw it piercing their virtual hand.

5.3. RQ3: Prior VR Experience

Prior VR Experience (RQ3) also produced interesting results. Our first finding was an interaction effect between Appearance and Prior VR Experience on agency and motor control: participants with no prior VR experience reported higher agency and motor control in the presence of the pixelated hand appearance. Perhaps these participants had no other virtual hand appearances in mind for comparison, so they rated their agency higher than participants with prior VR experience, who most likely compared the pixelated hand against hands they had used in the past. We also found a significant main effect on agency and motor control: participants with prior VR experience rated these aspects higher than those without. According to Haggard [87], past experience can influence the sense of agency during specific actions, and Bregman-Hai et al. [88] revealed a relationship between memories and agency. In our study, most participants had previously experienced VR; they were familiar with the controls and could finish the tasks smoothly. In contrast, we think that participants with no prior VR experience had to make an extra effort to get used to the VR devices, which is perhaps why they rated their agency and motor control lower.
We found another significant interaction effect between Prior VR Experience and Appearance on response to external stimuli. Specifically, in the presence of the pixelated hand appearance, participants with no prior VR experience provided higher ratings of the response to external stimuli than those with prior VR experience. Based on their comments, participants without prior VR experience considered this VR application interesting and provided higher ratings even when exposed to the pixelated virtual hand; they did not mention any contradiction between the virtual hand style and the tactile feedback. The VR-experienced participants, on the other hand, reported that the virtual hand felt unnatural; they expected a higher-fidelity, realistic hand for the interaction. We think that the pixelated hand, with its “mosaic” appearance, made these participants feel that such a hand model could not provide a realistic response to external stimuli, which is why they gave lower ratings. Moreover, the participant group without prior VR experience might have perceived a match between the low-quality hand appearance and the tactile feedback, and we think this “quality matching” positively impacted their ratings. However, we could not find any previously conducted studies to support such findings. Therefore, we argue that future research exploring this interaction effect is necessary to understand these results further.

5.4. Limitations

Our study has several limitations that should be reported and considered in future studies. Most of our participants were undergraduate students from our department; due to the narrow age range, we could not divide them into near-equal-size age groups. Furthermore, although we tried to recruit female participants, a gender difference remained in each group. We think that such an imbalance (14 males and 6 females) might have impacted our findings.
The hardware (tactile feedback device) used in this study is another limitation. The controllers of the Oculus Quest cannot provide rich contact and shape information when interacting with virtual objects. In our case, compared with haptic gloves and fingertip devices, the controllers could not generate detailed feedback for the gear and the spike. Pacchierotti et al. [89] asserted the importance of tactile feedback quality; thus, we think that using a higher-quality tactile feedback device would provide more accurate study results.
Furthermore, participants were exposed to a relatively short tutorial section. Based on our discussion, we assume that participants without prior VR experience did not have enough time to get used to the application and the VR system. Therefore, a longer and more complex tutorial section might be necessary for better adaptation.
We consider the design of the spike an additional limitation. Some participants sensed the tactile feedback of the spike but did not instantly notice the spike itself. Other studies, such as Lin and Jörg [9], used a knife falling from midair, which was more visually salient. Future studies should provide a more exaggerated visualization as a VR metaphor for the RHI.
Finally, we would like to mention a limitation with regard to handedness. We noticed that some participants used their left hand as their dominant hand during the study, yet we asked all participants to use their right hand to experience the spike. It would be interesting to explore the effect of hand dominance in this scenario.

5.5. Design Considerations

We think the knowledge obtained from this study should be documented to help the research community develop more efficient VR assembly applications that involve virtual hand appearance and tactile feedback. Therefore, this section presents our reflections on choosing an appropriate hand representation and the usage of tactile feedback.
Because hand appearance affects hand ownership and external appearance, we argue that it is more appropriate to use toon- or realistic-style shaders for virtual hands. Unless a future study confirms a positive effect of matching low-quality hand appearance with tactile feedback, we recommend avoiding pixelated hand appearances, since such a representation induces a lower VHI and participants regarded it as “unable to perceive” and “not visually clear.” In terms of tactile feedback, we recommend activating vibrotactile feedback when users interact with objects: since tactile feedback impacts presence, touch sensation, and response to external stimuli, it provides a more realistic VR experience. We think all of the above can be effective tools for enhancing the virtual hand illusion.
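As an illustration of this recommendation, the sketch below shows one way to gate vibrotactile pulses on object-contact events, so that feedback fires only when the virtual hand actually touches an object (as in our gear-grab and spike-pierce events). This is a minimal, hypothetical sketch in Python: our application was built in a game engine, and the class names, event names, and amplitude values here are invented for illustration rather than taken from our implementation.

```python
from dataclasses import dataclass, field

@dataclass
class HapticsController:
    """Hypothetical stand-in for a VR controller's vibration interface."""
    pulses: list = field(default_factory=list)  # log of (amplitude, duration_s)

    def vibrate(self, amplitude: float, duration_s: float) -> None:
        # A real backend would forward this call to the device SDK.
        self.pulses.append((amplitude, duration_s))

@dataclass
class ContactHaptics:
    """Fires a vibrotactile pulse only on contact events."""
    controller: HapticsController
    grab_amplitude: float = 0.4    # gentle pulse for grasping a gear (assumed value)
    pierce_amplitude: float = 1.0  # strong pulse for the spike piercing the hand

    def on_contact(self, event: str) -> None:
        if event == "grab_gear":
            self.controller.vibrate(self.grab_amplitude, 0.05)
        elif event == "spike_pierce":
            self.controller.vibrate(self.pierce_amplitude, 0.30)
        # Any other event produces no vibration: feedback stays tied to contact.

haptics = ContactHaptics(HapticsController())
haptics.on_contact("grab_gear")
haptics.on_contact("spike_pierce")
haptics.on_contact("hover")  # no contact, so no pulse is logged
print(haptics.controller.pulses)  # [(0.4, 0.05), (1.0, 0.3)]
```

The design point is simply that haptic output is driven by discrete contact events rather than being always on, which keeps the tactile channel informative about object interaction.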

6. Conclusions and Future Work

We conducted a VR study to explore the effects of hand appearance and tactile feedback on presence and body ownership dimensions. We developed a VR application in which participants were asked to assemble different parts into a 3D model, with an imitation of the RHI study at the end of the tasks. In response to RQ1, the pixelated hand appearance induced a lower perception of hand ownership and external appearance. In response to RQ2, tactile feedback generated a stronger sense of presence, touch sensation, and response to external stimuli. In answer to RQ3, we found that prior VR experience enhanced agency and motor control while reducing the response to external stimuli ratings in the presence of the pixelated hand.
As for future work, there are several avenues to investigate. Modern VR applications already make good use of the legs, feet, and even the upper body; thus, in future research, we would like to explore the effect of the realistic appearance of the whole body of self-avatars. Moreover, the effect of tactile feedback parameters (e.g., intensity and frequency) on the ownership of virtual hands is a direction worth exploring. Such studies can help us better understand how to evaluate realistic appearance and tactile feedback in self-avatars and user-controlled body parts in VR applications. Finally, we could deepen the research on hand ownership by using the Oculus Quest's built-in hand interaction model as an alternative to real-world controllers.

Author Contributions

Conceptualization, D.C. and C.M.; methodology, D.C.; software, D.C.; validation, D.C. and C.M.; data curation, D.C.; writing—original draft preparation, D.C.; writing—review and editing, D.C. and C.M.; supervision, C.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Purdue University (protocol code: IRB-2021-1435; date of approval: 10-06-2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data and experimental materials used in this study are available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. De Monsabert, B.G.; Edwards, D.; Shah, D.; Kedgley, A. Importance of consistent datasets in musculoskeletal modelling: A study of the hand and wrist. Ann. Biomed. Eng. 2018, 46, 71–85. [Google Scholar] [CrossRef] [PubMed]
  2. Peck, T.C.; Tutar, A. The impact of a self-avatar, hand collocation, and hand proximity on embodiment and stroop interference. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1964–1971. [Google Scholar] [CrossRef]
  3. Tseng, P.; Bridgeman, B. Improved change detection with nearby hands. Exp. Brain Res. 2011, 209, 257–269. [Google Scholar] [CrossRef]
  4. Weidler, B.J.; Abrams, R.A. Hand proximity—Not arm posture—Alters vision near the hands. Atten. Percept. Psychophys. 2013, 75, 650–653. [Google Scholar] [CrossRef] [PubMed]
  5. Kondo, R.; Sugimoto, M.; Minamizawa, K.; Hoshi, T.; Inami, M.; Kitazaki, M. Illusory body ownership of an invisible body interpolated between virtual hands and feet via visual-motor synchronicity. Sci. Rep. 2018, 8, 7541. [Google Scholar] [CrossRef] [PubMed]
  6. Zell, E.; Aliaga, C.; Jarabo, A.; Zibrek, K.; Gutierrez, D.; McDonnell, R.; Botsch, M. To stylize or not to stylize? The effect of shape and material stylization on the perception of computer-generated faces. ACM Trans. Graph. (TOG) 2015, 34, 1–12. [Google Scholar] [CrossRef]
  7. Höll, M.; Oberweger, M.; Arth, C.; Lepetit, V. Efficient physics-based implementation for realistic hand-object interaction in virtual reality. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 175–182. [Google Scholar]
  8. Hoyet, L.; Argelaguet, F.; Nicole, C.; Lécuyer, A. “Wow! I Have Six Fingers!”: Would You Accept Structural Changes of Your Hand in VR? Front. Robot. AI 2016, 3, 27. [Google Scholar] [CrossRef]
  9. Lin, L.; Jörg, S. Need a hand? How appearance affects the virtual hand illusion. In Proceedings of the ACM Symposium on Applied Perception, Anaheim, CA, USA, 22–23 July 2016; pp. 69–76. [Google Scholar]
  10. Schwind, V.; Knierim, P.; Tasci, C.; Franczak, P.; Haas, N.; Henze, N. “These are not my hands!” Effect of Gender on the Perception of Avatar Hands in Virtual Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1577–1582. [Google Scholar]
  11. Pamungkas, D.S.; Ward, K. Electro-tactile feedback system to enhance virtual reality experience. Int. J. Comput. Theory Eng. 2016, 8, 465. [Google Scholar] [CrossRef]
  12. Cui, D.; Mousas, C. An on-site and remote study during the COVID-19 pandemic on virtual hand appearance and tactile feedback. Behav. Inf. Technol. 2021, 40, 1278–1291. [Google Scholar] [CrossRef]
  13. Khamis, M.; Schuster, N.; George, C.; Pfeiffer, M. ElectroCutscenes: Realistic Haptic Feedback in Cutscenes of Virtual Reality Games Using Electric Muscle Stimulation. In Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, Sydney, Australia, 11–15 September 2019; pp. 1–10. [Google Scholar]
  14. Ma, K.; Hommel, B. The virtual-hand illusion: Effects of impact and threat on perceived ownership and affective resonance. Front. Psychol. 2013, 4, 604. [Google Scholar] [CrossRef] [Green Version]
  15. Schwind, V.; Lin, L.; Di Luca, M.; Jörg, S.; Hillis, J. Touch with foreign hands: The effect of virtual hand appearance on visual-haptic integration. In Proceedings of the 15th ACM Symposium on Applied Perception, Vancouver, BC, Canada, 10–11 August 2018; pp. 1–8. [Google Scholar]
  16. Schild, J.; Flock, L.; Martens, P.; Roth, B.; Schünemann, N.; Heller, E.; Misztal, S. EPICSAVE Lifesaving Decisions—A Collaborative VR Training Game Sketch for Paramedics. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; p. 1389. [Google Scholar]
  17. Walker, J.M.; Blank, A.A.; Shewokis, P.A.; O’Malley, M.K. Tactile feedback of object slip facilitates virtual object manipulation. IEEE Trans. Haptics 2015, 8, 454–466. [Google Scholar] [CrossRef]
  18. Schiefer, M.A.; Graczyk, E.L.; Sidik, S.M.; Tan, D.W.; Tyler, D.J. Artificial tactile and proprioceptive feedback improves performance and confidence on object identification tasks. PLoS ONE 2018, 13, e0207659. [Google Scholar] [CrossRef] [PubMed]
  19. Jung, S.; Bruder, G.; Wisniewski, P.J.; Sandor, C.; Hughes, C.E. Over my hand: Using a personalized hand in vr to improve object size estimation, body ownership, and presence. In Proceedings of the Symposium on Spatial User Interaction, Berlin, Germany, 13–14 October 2018; pp. 60–68. [Google Scholar]
  20. Ogawa, N.; Narumi, T.; Hirose, M. Object size perception in immersive virtual reality: Avatar realism affects the way we perceive. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 647–648. [Google Scholar]
  21. Volante, M.; Babu, S.V.; Chaturvedi, H.; Newsome, N.; Ebrahimi, E.; Roy, T.; Daily, S.B.; Fasolino, T. Effects of virtual human appearance fidelity on emotion contagion in affective inter-personal simulations. IEEE Trans. Vis. Comput. Graph. 2016, 22, 1326–1335. [Google Scholar] [CrossRef] [PubMed]
  22. MacDorman, K.F.; Green, R.D.; Ho, C.C.; Koch, C.T. Too real for comfort? Uncanny responses to computer generated faces. Comput. Hum. Behav. 2009, 25, 695–710. [Google Scholar] [CrossRef]
  23. Ring, L.; Utami, D.; Bickmore, T. The right agent for the job? In International Conference on Intelligent Virtual Agents; Springer: Cham, Switzerland, 2014; pp. 374–384. [Google Scholar]
  24. Doran, J.P.; Zucconi, A. Unity 2018 Shaders and Effects Cookbook: Transform Your Game into a Visually Stunning Masterpiece with over 70 Recipes; Packt Publishing Ltd.: Birmingham, UK, 2018. [Google Scholar]
  25. Nair, R. Using Raymarched Shaders as Environments in 3D Video Games; Drexel University: Philadelphia, PA, USA, 2020. [Google Scholar]
  26. Nordberg, J. Visual Effects for Mobile Games: Creating a Clean Visual Effect for Small Screens. Bachelor’s Thesis, University of Applied Sciences, Turku, Finland, 2020. [Google Scholar]
  27. Argelaguet, F.; Hoyet, L.; Trico, M.; Lécuyer, A. The role of interaction in virtual embodiment: Effects of the virtual hand representation. In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 3–10. [Google Scholar]
  28. Koilias, A.; Mousas, C.; Anagnostopoulos, C.N. The effects of motion artifacts on self-avatar agency. Informatics 2019, 6, 18. [Google Scholar] [CrossRef]
  29. Kokkinara, E.; Slater, M.; López-Moliner, J. The effects of visuomotor calibration to the perceived space and body, through embodiment in immersive virtual reality. ACM Trans. Appl. Percept. (TAP) 2015, 13, 1–22. [Google Scholar] [CrossRef]
  30. Krogmeier, C.; Mousas, C.; Whittinghill, D. Human–virtual character interaction: Toward understanding the influence of haptic feedback. Comput. Animat. Virtual Worlds 2019, 30, e1883. [Google Scholar] [CrossRef]
  31. Fribourg, R.; Argelaguet, F.; Lécuyer, A.; Hoyet, L. Avatar and sense of embodiment: Studying the relative preference between appearance, control and point of view. IEEE Trans. Vis. Comput. Graph. 2020, 26, 2062–2072. [Google Scholar] [CrossRef] [PubMed]
  32. Gonzalez-Franco, M.; Peck, T.C. Avatar embodiment. towards a standardized questionnaire. Front. Robot. AI 2018, 5, 74. [Google Scholar] [CrossRef]
  33. Maselli, A.; Slater, M. The building blocks of the full body ownership illusion. Front. Hum. Neurosci. 2013, 7, 83. [Google Scholar] [CrossRef] [Green Version]
  34. Kokkinara, E.; Slater, M. Supplementary Material for:“Measuring the Effects through Time of the Influence of Visuomotor and Visuotactile Synchronous Stimulation on a Virtual Body Ownership Illusion”. Perception 2014, 43, NP1–NP4. [Google Scholar] [CrossRef]
  35. Kilteni, K.; Bergstrom, I.; Slater, M. Drumming in immersive virtual reality: The body shapes the way we play. IEEE Trans. Vis. Comput. Graph. 2013, 19, 597–605. [Google Scholar] [CrossRef]
  36. Peck, T.C.; Seinfeld, S.; Aglioti, S.M.; Slater, M. Putting yourself in the skin of a black avatar reduces implicit racial bias. Conscious. Cogn. 2013, 22, 779–787. [Google Scholar] [CrossRef]
  37. Sposito, A.V.; Bolognini, N.; Vallar, G.; Posteraro, L.; Maravita, A. The spatial encoding of body parts in patients with neglect and neurologically unimpaired participants. Neuropsychologia 2010, 48, 334–340. [Google Scholar] [CrossRef]
  38. Longo, M.R.; Haggard, P. An implicit body representation underlying human position sense. Proc. Natl. Acad. Sci. USA 2010, 107, 11727–11732. [Google Scholar] [CrossRef]
  39. Coelho, L.A.; Gonzalez, C.L. The visual and haptic contributions to hand perception. Psychol. Res. 2018, 82, 866–875. [Google Scholar] [CrossRef] [PubMed]
  40. Zibrek, K.; Kokkinara, E.; McDonnell, R. The effect of realistic appearance of virtual characters in immersive environments-does the character’s personality play a role? IEEE Trans. Vis. Comput. Graph. 2018, 24, 1681–1690. [Google Scholar] [CrossRef]
  41. Lin, L.; Normoyle, A.; Adkins, A.; Sun, Y.; Robb, A.; Ye, Y.; Di Luca, M.; Jörg, S. The effect of hand size and interaction modality on the virtual hand illusion. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 510–518. [Google Scholar]
  42. Moehring, M.; Froehlich, B. Effective manipulation of virtual objects within arm’s reach. In Proceedings of the 2011 IEEE Virtual Reality Conference, Osaka, Japan, 23–27 March 2011; pp. 131–138. [Google Scholar]
  43. Lin, J.; Schulze, J.P. Towards naturally grabbing and moving objects in VR. Electron. Imaging 2016, 2016, 1–6. [Google Scholar] [CrossRef]
  44. Porter III, J.; Boyer, M.; Robb, A. Guidelines on successfully porting non-immersive games to virtual reality: A case study in minecraft. In Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play, Melbourne, Australia, 28–31 October 2018; pp. 405–415. [Google Scholar]
  45. Lo, J.; Johansson, R.S. Regional differences and interindividual variability in sensitivity to vibration in the glabrous skin of the human hand. Brain Res. 1984, 301, 65–72. [Google Scholar]
  46. Merchel, S.; Altinsoy, M.E. Psychophysical comparison of the auditory and tactile perception: A survey. J. Multimodal User Interfaces 2020, 14, 271–283. [Google Scholar] [CrossRef]
  47. Gaudeni, C.; Meli, L.; Jones, L.A.; Prattichizzo, D. Presenting surface features using a haptic ring: A psychophysical study on relocating vibrotactile feedback. IEEE Trans. Haptics 2019, 12, 428–437. [Google Scholar] [CrossRef] [Green Version]
  48. Fröhner, J.; Salvietti, G.; Beckerle, P.; Prattichizzo, D. Can wearable haptic devices foster the embodiment of virtual limbs? IEEE Trans. Haptics 2018, 12, 339–349. [Google Scholar] [CrossRef]
  49. Moore, C.H.; Corbin, S.F.; Mayr, R.; Shockley, K.; Silva, P.L.; Lorenz, T. Grasping Embodiment: Haptic Feedback for Artificial Limbs. Front. Neurorobot. 2021, 15, 66. [Google Scholar] [CrossRef]
  50. Richard, G.; Pietrzak, T.; Argelaguet, F.; Lécuyer, A.; Casiez, G. Studying the Role of Haptic Feedback on Virtual Embodiment in a Drawing Task. Front. Virtual Real. 2021, 1, 28. [Google Scholar] [CrossRef]
  51. Gutiérrez, Á.; Sepúlveda-Muñoz, D.; Gil-Agudo, Á.; de los Reyes Guzman, A. Serious game platform with haptic feedback and EMG monitoring for upper limb rehabilitation and smoothness quantification on spinal cord injury patients. Appl. Sci. 2020, 10, 963. [Google Scholar] [CrossRef]
  52. McGregor, C.; Bonnis, B.; Stanfield, B.; Stanfield, M. Design of the ARAIG haptic garment for enhanced resilience assessment and development in tactical training serious games. In Proceedings of the 2016 IEEE 6th International Conference on Consumer Electronics-Berlin (ICCE-Berlin), Berlin, Germany, 5–7 September 2016; pp. 214–217. [Google Scholar]
  53. Cui, D.; Mousas, C. Evaluating Wearable Tactile Feedback Patterns During a Virtual Reality Fighting Game. In Proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bari, Italy, 4–8 October 2021; pp. 328–333. [Google Scholar]
  54. Cui, D.; Kao, D.; Mousas, C. Toward understanding embodied human-virtual character interaction through virtual and tactile hugging. Comput. Animat. Virtual Worlds 2021, 32, e2009. [Google Scholar] [CrossRef]
  55. Koilias, A.; Mousas, C.; Anagnostopoulos, C.N. I feel a moving crowd surrounds me: Exploring tactile feedback during immersive walking in a virtual crowd. Comput. Animat. Virtual Worlds 2020, 31, e1963. [Google Scholar] [CrossRef]
  56. Krogmeier, C.; Mousas, C.; Whittinghill, D. Human, virtual human, bump! a preliminary study on haptic feedback. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1032–1033. [Google Scholar]
  57. Guterstam, A.; Gentile, G.; Ehrsson, H.H. The invisible hand illusion: Multisensory integration leads to the embodiment of a discrete volume of empty space. J. Cogn. Neurosci. 2013, 25, 1078–1099. [Google Scholar] [CrossRef]
  58. Fossataro, C.; Bruno, V.; Gindri, P.; Pia, L.; Berti, A.; Garbarini, F. Feeling touch on the own hand restores the capacity to visually discriminate it from someone else’s hand: Pathological embodiment receding in brain-damaged patients. Cortex 2018, 104, 207–219. [Google Scholar] [CrossRef] [PubMed]
  59. Botvinick, M.; Cohen, J. Rubber hands ‘feel’ touch that eyes see. Nature 1998, 391, 756. [Google Scholar] [CrossRef]
  60. Ehrsson, H.H.; Spence, C.; Passingham, R.E. That is my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science 2004, 305, 875–877. [Google Scholar] [CrossRef] [Green Version]
  61. Tsakiris, M.; Haggard, P. The rubber hand illusion revisited: Visuotactile integration and self-attribution. J. Exp. Psychol. Hum. Percept. Perform. 2005, 31, 80. [Google Scholar] [CrossRef] [PubMed]
  62. Tsakiris, M.; Carpenter, L.; James, D.; Fotopoulou, A. Hands only illusion: Multisensory integration elicits sense of ownership for body parts but not for non-corporeal objects. Exp. Brain Res. 2010, 204, 343–352. [Google Scholar] [CrossRef] [PubMed]
  63. Farmer, H.; Maister, L.; Tsakiris, M. Change my body, change my mind: The effects of illusory ownership of an outgroup hand on implicit attitudes toward that outgroup. Front. Psychol. 2014, 4, 1016. [Google Scholar] [CrossRef]
  64. Pyasik, M.; Tieri, G.; Pia, L. Visual appearance of the virtual hand affects embodiment in the virtual hand illusion. Sci. Rep. 2020, 10, 5412. [Google Scholar] [CrossRef]
  65. Sanchez-Vives, M.V.; Spanlang, B.; Frisoli, A.; Bergamasco, M.; Slater, M. Virtual hand illusion induced by visuomotor correlations. PLoS ONE 2010, 5, e10381. [Google Scholar] [CrossRef]
  66. Brugada-Ramentol, V.; Clemens, I.; de Polavieja, G.G. Active control as evidence in favor of sense of ownership in the moving Virtual Hand Illusion. Conscious. Cogn. 2019, 71, 123–135. [Google Scholar] [CrossRef] [PubMed]
  67. Choi, W.; Li, L.; Satoh, S.; Hachimura, K. Multisensory integration in the virtual hand illusion with active movement. BioMed Res. Int. 2016, 2016, 8163098. [Google Scholar] [CrossRef]
  68. Qu, J.; Sun, Y.; Yang, L.; Hommel, B.; Ma, K. Physical load reduces synchrony effects on agency and ownership in the virtual hand illusion. Conscious. Cogn. 2021, 96, 103227. [Google Scholar] [CrossRef]
  69. Schatz, R.; Regal, G.; Schwarz, S.; Suettc, S.; Kempf, M. Assessing the QoE impact of 3D rendering style in the context of VR-based training. In Proceedings of the 2018 Tenth International Conference on Quality of Multimedia Experience (QoMEX), Cagliari, Italy, 29 May–1 June 2018; pp. 1–6. [Google Scholar]
  70. Torkhani, F.; Wang, K.; Chassery, J.M. Perceptual quality assessment of 3D dynamic meshes: Subjective and objective studies. Signal Process. Image Commun. 2015, 31, 185–204. [Google Scholar] [CrossRef]
  71. Lavoué, G.; Mantiuk, R. Quality assessment in computer graphics. In Visual Signal Quality Assessment; Springer: Cham, Switzerland, 2015; pp. 243–286. [Google Scholar]
  72. Kiefl, N.; Figas, P.; Bichlmeier, C. Effects of graphical styles on emotional states for VR-supported psychotherapy. In Proceedings of the 2018 10th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), Wurzburg, Germany, 5–7 September 2018; pp. 1–4. [Google Scholar]
  73. Zibrek, K.; Martin, S.; McDonnell, R. Is photorealism important for perception of expressive virtual humans in virtual reality? ACM Trans. Appl. Percept. (TAP) 2019, 16, 1–19. [Google Scholar] [CrossRef]
  74. Kao, D.; Magana, A.J.; Mousas, C. Evaluating Tutorial-Based Instructions for Controllers in Virtual Reality Games. Proc. ACM Hum.-Comput. Interact. 2021, 5, 1–28. [Google Scholar] [CrossRef]
  75. Lougiakis, C.; Katifori, A.; Roussou, M.; Ioannidis, I.P. Effects of virtual hand representation on interaction and embodiment in HMD-based virtual environments using controllers. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 510–518. [Google Scholar]
  76. Usoh, M.; Catena, E.; Arman, S.; Slater, M. Using presence questionnaires in reality. Presence 2000, 9, 497–503. [Google Scholar] [CrossRef]
  77. Gliem, J.A.; Gliem, R.R. Calculating, interpreting, and reporting Cronbach’s alpha reliability coefficient for Likert-type scales. In Proceedings of the Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education, Columbus, OH, USA, 2003. [Google Scholar]
  78. Croasmun, J.T.; Ostrom, L. Using likert-type scales in the social sciences. J. Adult Educ. 2011, 40, 19–22. [Google Scholar]
  79. Latoschik, M.E.; Roth, D.; Gall, D.; Achenbach, J.; Waltemate, T.; Botsch, M. The effect of avatar realism in immersive social virtual realities. In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, Gothenburg Sweden, 8–10 November 2017; pp. 1–10. [Google Scholar]
  80. Lugrin, J.L.; Latt, J.; Latoschik, M.E. Anthropomorphism and Illusion of Virtual Body Ownership. In Proceedings of the ICAT-EGVE, Kyoto, Japan, 28–30 October 2015; pp. 1–8. [Google Scholar]
  81. Heinrich, C.; Cook, M.; Langlotz, T.; Regenbrecht, H. My hands? Importance of personalised virtual hands in a neurorehabilitation scenario. Virtual Real. 2021, 25, 313–330. [Google Scholar] [CrossRef]
  82. Thaler, A.; Piryankova, I.; Stefanucci, J.K.; Pujades, S.; de La Rosa, S.; Streuber, S.; Romero, J.; Black, M.J.; Mohler, B.J. Visual perception and evaluation of photo-realistic self-avatars from 3D body scans in males and females. Front. ICT 2018, 5, 18. [Google Scholar] [CrossRef]
  83. Vargas, L.; Huang, H.H.; Zhu, Y.; Hu, X. Static and dynamic proprioceptive recognition through vibrotactile stimulation. J. Neural Eng. 2021, 18, 046093. [Google Scholar] [CrossRef]
  84. Cooper, N.; Milella, F.; Pinto, C.; Cant, I.; White, M.; Meyer, G. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment. PLoS ONE 2018, 13, e0191846. [Google Scholar] [CrossRef] [PubMed]
  85. Schecter, S.; Lin, W.; Gopal, A.; Fan, R.; Rashba, E. Haptics and the heart: Force and tactile feedback system for cardiovascular interventions. Cardiovasc. Revasc. Med. 2018, 19, 36–40. [Google Scholar] [CrossRef]
  86. D’Alonzo, M.; Clemente, F.; Cipriani, C. Vibrotactile stimulation promotes embodiment of an alien hand in amputees with phantom sensations. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 23, 450–457. [Google Scholar] [CrossRef] [PubMed]
  87. Haggard, P. Sense of agency in the human brain. Nat. Rev. Neurosci. 2017, 18, 196–207. [Google Scholar] [CrossRef] [PubMed]
  88. Bregman-Hai, N.; Kessler, Y.; Soffer-Dudek, N. Who wrote that? Automaticity and reduced sense of agency in individuals prone to dissociative absorption. Conscious. Cogn. 2020, 78, 102861. [Google Scholar] [CrossRef] [PubMed]
  89. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. The three different hand appearances we examined in this study (from left to right: realistic, pixelated, and toon hand appearance).
Figure 2. The training (tutorial) and testing sections of our application: the tutorial level, provided to familiarize participants with the operations and mechanics (left); the testing section, in which participants were asked to assemble the gears (middle); and the spike “piercing” the virtual hand (right).
Figure 3. A participant during the study in our lab setup.
Figure 4. Boxplots of all results for each examined condition. The vertical axis refers to the 7-point scale. Boxes enclose the middle 50% of the data. A thick horizontal line denotes the median.
Figure 5. Interaction plots (Appearance × Prior VR Experience) for the agency and motor control and the response to external stimuli dimensions. The vertical axis refers to the 7-point scale.
Table 1. The statements/questions used in our study.
No. | Variable | Question/Statement
1 | Presence | Please rate your sense of being in the virtual environment on a scale of 1 to 7, where 7 represents your normal experience of being in a place.
2 | | To what extent were there times during the experience when the virtual environment was the reality for you? 1 indicates not at all, and 7 indicates totally.
3 | | During the time of the experience, which was the strongest on the whole: your sense of being in the virtual environment or of being elsewhere? 1 indicates being in the virtual environment, and 7 indicates being elsewhere.
4 | | During the time of your experience, did you often think to yourself that you were actually in the virtual environment? 1 indicates strongly disagree, and 7 indicates strongly agree.
5 | Hand Ownership | I felt as if the virtual hand was my real hand. 1 indicates strongly disagree, and 7 indicates strongly agree.
6 | | It felt as if the virtual hand I saw was someone else’s. 1 indicates strongly disagree, and 7 indicates strongly agree.
7 | | It seemed as if I might have more than one hand. 1 indicates strongly disagree, and 7 indicates strongly agree.
8 | Touch Sensation | It seemed as if I felt the vibration of the equipment in the location where I saw the virtual hand touch the object. 1 indicates strongly disagree, and 7 indicates strongly agree.
9 | | It seemed as if the touch I felt was located somewhere between my physical hand and the virtual hand. 1 indicates strongly disagree, and 7 indicates strongly agree.
10 | | It seemed as if the touch I felt was caused by the hand touching the object. 1 indicates strongly disagree, and 7 indicates strongly agree.
11 | | It seemed as if my hand was touching the gears. 1 indicates strongly disagree, and 7 indicates strongly agree.
12 | Agency and Motor Control | It felt like I could control the virtual hand as if it was my own hand. 1 indicates strongly disagree, and 7 indicates strongly agree.
13 | | The movements of the virtual hands were caused by my movements. 1 indicates strongly disagree, and 7 indicates strongly agree.
14 | | I felt as if the movements of the virtual hands were influencing my own movements. 1 indicates strongly disagree, and 7 indicates strongly agree.
15 | | I felt as if the virtual hand was moving by itself. 1 indicates strongly disagree, and 7 indicates strongly agree.
16 | External Appearance | It felt as if my (real) hand were turning into an “avatar” hand. 1 indicates strongly disagree, and 7 indicates strongly agree.
17 | | At some point, it felt as if my real hand was starting to take on the posture or shape of the virtual hand that I saw. 1 indicates strongly disagree, and 7 indicates strongly agree.
18 | | At some point, it felt that the virtual hand resembled my own (real) hand in terms of shape, skin tone, or other visual features. 1 indicates strongly disagree, and 7 indicates strongly agree.
19 | Response to External Stimuli | I felt a tactile sensation in my hand when I saw the spike coming out. 1 indicates strongly disagree, and 7 indicates strongly agree.
20 | | When the spike came out, I felt the instinct to draw my hands out. 1 indicates strongly disagree, and 7 indicates strongly agree.
21 | | I felt as if my hand had hurt. 1 indicates strongly disagree, and 7 indicates strongly agree.
22 | | I had the feeling that I might be harmed by the spike. 1 indicates strongly disagree, and 7 indicates strongly agree.
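The 22 items above are grouped into presence and the five embodiment dimensions analyzed in the study. As a minimal sketch, per-participant dimension scores could be aggregated from the 7-point ratings as follows; the item-to-dimension grouping follows Table 1, but treating the negatively worded items (3, 6, 7, and 15) as reverse-scored is our own assumption rather than a procedure stated in the paper, and the function name `dimension_scores` is hypothetical.

```python
# Hypothetical aggregation of the Table 1 items into dimension scores.
# Grouping follows Table 1; the reverse-scored set is our assumption
# based on the negative wording of those items.

DIMENSIONS = {
    "presence": [1, 2, 3, 4],
    "hand_ownership": [5, 6, 7],
    "touch_sensation": [8, 9, 10, 11],
    "agency_motor_control": [12, 13, 14, 15],
    "external_appearance": [16, 17, 18],
    "response_external_stimuli": [19, 20, 21, 22],
}

# Assumed reverse-scored items (e.g., item 3: 7 indicates "being elsewhere").
REVERSED = {3, 6, 7, 15}


def dimension_scores(responses):
    """responses: dict mapping item number (1-22) to a 1-7 rating.

    Returns the mean rating per dimension, with assumed reverse-scored
    items flipped on the 1-7 scale (rating r becomes 8 - r).
    """
    scores = {}
    for dim, items in DIMENSIONS.items():
        vals = [8 - responses[i] if i in REVERSED else responses[i]
                for i in items]
        scores[dim] = sum(vals) / len(vals)
    return scores
```

For example, a participant answering 7 on every item would score 3.0 on hand ownership, because two of its three items are flipped under this assumed reverse-scoring.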
Cui, D.; Mousas, C. Evaluating Virtual Hand Illusion through Realistic Appearance and Tactile Feedback. Multimodal Technol. Interact. 2022, 6, 76. https://doi.org/10.3390/mti6090076