Article

Enhancing Learning of 3D Model Unwrapping through Virtual Reality Serious Game: Design and Usability Validation

by Bruno Rodriguez-Garcia, José Miguel Ramírez-Sanz, Ines Miguel-Alonso and Andres Bustillo *
Departamento de Ingeniería Informática, Universidad de Burgos, 09006 Burgos, Spain
* Author to whom correspondence should be addressed.
Electronics 2024, 13(10), 1972; https://doi.org/10.3390/electronics13101972
Submission received: 18 April 2024 / Revised: 14 May 2024 / Accepted: 15 May 2024 / Published: 17 May 2024
(This article belongs to the Special Issue Serious Games and Extended Reality (XR))

Abstract

Given the difficulty of explaining the unwrapping process through traditional teaching methodologies, this article presents the design, development, and validation of an immersive Virtual Reality (VR) serious game, named Unwrap 3D Virtual: Ready (UVR), aimed at facilitating the learning of unwrapping 3D models. The game incorporates animations to help users understand the unwrapping process, following Mayer’s Cognitive Theory of Multimedia Learning and Gamification principles. It is structured into four levels of increasing complexity, through which users progress across different aspects of 3D model unwrapping, with the final level allowing for result review. A sample of 53 students with experience in 3D modeling was categorized based on device (PC or VR) and previous experience (XP) in VR, resulting in Low-XP, Mid-XP, and High-XP groups. Hierarchical clustering identified three clusters, reflecting varied user behaviors. Results from surveys assessing game experience, presence, and satisfaction show that VR users reported higher immersion, although satisfaction was greater in the PC group due to a bug in the VR version. Novice users exhibited higher satisfaction, which was attributed to the novelty effect, while experienced users demonstrated greater control and proficiency.

1. Introduction

Virtual Reality (VR) technology has undergone remarkable advancements in recent years, evolving from basic 3D visualizations to immersive experiences that enable user interaction within virtual environments. This progression, particularly in immersive VR, has found diverse applications across various domains, including medicine, industry, and education [1]. Notably, VR has shown its potential to improve learning outcomes [2,3], particularly in subjects requiring strong visual and spatial comprehension, such as geometry, 3D modeling, and computer science [4].
Amidst the integration of VR into educational settings, a new category of educational tools known as ‘VR serious games’ has emerged. These immersive experiences serve not only to educate but also to entertain, providing engaging platforms for reinforcing learning objectives. Among the myriad applications of VR serious games, one particularly promising area lies in elucidating complex concepts related to 3D modeling and texturing. For example, in [5], authors developed a VR application aimed at enhancing understanding of various views and modeling techniques within VR environments. Another study analyzed factors affecting usability in 3D modeling in VR [6], highlighting the importance of realistic objects for immersion, coherent size with reality, stability of objects, and inclusion of feedback mechanisms.
However, despite notable advancements, challenges persist, especially for intermediate users grappling with the intricacies of 3D modeling production, including the often-overlooked task of texturing meshes to achieve realistic visual representations. The texturing process requires users to manipulate the mesh of their 3D model to create textures that fit perfectly with each other, a task that demands spatial intelligence. While VR technology has shown the potential to improve spatial ability, challenges exist in designing experiences that foster learning without hindrance [7]. Additionally, other studies have highlighted the positive impact of VR on spatial ability improvement [8].
This research aims to address the aforementioned gap by introducing Unwrap 3D Virtual: Ready (UVR), an innovative VR serious game specifically designed to provide an immersive learning experience focused on mastering 3D model unwrapping modalities. By targeting this critical aspect of 3D modeling production, this study seeks to enhance the educational potential of VR technology and contribute to the advancement of VR-based learning tools. As an initial assessment, the usability and user satisfaction of UVR were evaluated in a validation study involving 53 students. The focus of the research was to gauge user feedback regarding usability and satisfaction with the game interface rather than measuring learning outcomes directly. For this validation, users were split into two groups: one using UVR in VR and the other on PC. Additionally, these groups were divided into three subgroups based on their previous experience with VR in order to account for the novelty effect, which can affect satisfaction and learning outcomes [4]. Usability and satisfaction were assessed using the Game Experience Questionnaire (GEQ), analyzing its Core In-Game version (GEQ-IG) and Post-Game module (GEQ-PG), together with the Presence Questionnaire (PQ). These findings not only shed light on the user experience of UVR but also offer insights into the potential of VR serious games in enhancing user engagement and satisfaction in educational contexts.
The main focus of this research is the design of VR experiences for optimal usability. Additionally, the novelty effect was considered, as users unfamiliar with VR environments may respond differently in usability and satisfaction surveys.
In the context of 3D modeling and unwrapping, an important aspect to consider is the concept of UVs (the two-dimensional coordinate space of a 3D model’s surface). UV mapping involves unwrapping a 3D model’s surface to lay it flat in 2D space, enabling textures to be applied accurately. UVs are crucial in ensuring that textures wrap seamlessly around a model, preserving its visual fidelity and realism. Understanding UV mapping principles and techniques is essential for achieving high-quality texture mapping in 3D modeling and is fundamental to the texturing process in computer graphics and virtual environments. Therefore, in the development of UVR, special attention was given to teaching users about UV mapping and its significance in texturing 3D models within the immersive environment.
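As a concrete illustration of this concept, the following sketch (written against Blender’s Python API, assuming Blender 3.6 and an active mesh object; it is not code from UVR itself) shows how one of the geometric projections taught in UVR, a cube projection, could be applied and the resulting UV coordinates inspected:

    import bpy

    obj = bpy.context.active_object            # the mesh to be unwrapped
    bpy.ops.object.mode_set(mode='EDIT')       # unwrapping operators work in Edit Mode
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.cube_project()                  # cubic projection, one of UVR's geometric unwraps
    bpy.ops.object.mode_set(mode='OBJECT')

    uv_layer = obj.data.uv_layers.active       # per-face-corner (U, V) coordinates in 2D space
    for loop in obj.data.loops[:4]:            # print the first few corners as an example
        u, v = uv_layer.data[loop.index].uv
        print(f"loop {loop.index}: U = {u:.2f}, V = {v:.2f}")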
This article is structured as follows: Section 2 reviews the state of the art, Section 3 describes the methodology and materials of the research, Section 4 presents the results, Section 5 discusses them, and Section 6 concludes the paper.

2. State of the Art

Prior to this experiment, several VR applications have been developed to support conventional learning, particularly in abstract subjects such as mathematics. For example, Hsu developed a VR application aimed at bolstering learning in this domain [9], yielding positive outcomes in terms of learning efficacy. Similarly, the study by İbili found that Augmented Reality (AR) enhanced 3D thinking skills, contributing to the learning process [10]. These findings were further supported by Simonetti, who utilized a VR application to elucidate mathematical concepts [11]. In the field of medicine, Chen proposed a VR application to facilitate the learning of skull anatomy [12]. Through a comparative analysis, their study demonstrated the effectiveness of VR in improving motivation levels without compromising learning outcomes when compared to traditional methods, such as physical atlases. Another notable example of a VR serious game applied to Underwater Cultural Heritage is explored in [13], which investigates the utility of a diving simulator to facilitate the reconstruction of excavations. Additionally, in the realm of archaeological research, Mixed Reality is used in [14] as a platform for a collaborative experimental system to visualize archaeological excavations, allowing multiple users to visualize the same space and take and share notes synchronously. Furthermore, [15] explores the potential of VR serious games and simulations, emphasizing the importance of knowledge transfer and haptic feedback to enhance presence and immersion. The utility of these technologies for education is reinforced by [16], which studies usability and user acceptance when these technologies substitute conventional learning methods.
In the design of the serious game UVR, several design principles were followed to create a comfortable experience. Previous research included the Cognitive Theory of Multimedia Learning [17], which was adapted to a VR application [4]. This theory comprises principles aimed at enhancing the understanding of content, some of which can be applied to avoid distractions, guide attention, and reinforce information from different sources. According to the authors’ conclusions, the design of their application was successful. Additionally, in [18], the authors combined this theory with Gamification principles to improve motivation and engagement during gameplay. Gamification in VR enhances intrinsic motivation and engagement [19]. Following Faust’s research [20], various design frameworks for gamifying video games and applications have been developed, each suitable for different learning contexts. Storytelling has been employed as a Gamification technique to enhance user involvement, as demonstrated by Staneva [21]. Other research has adapted completion-contingent rewards into performance-contingent rewards in a game-based learning context, successfully improving motivation and learning outcomes [22]. Although numerous Gamification techniques exist, the study by Tiefenbacher compared techniques such as high scores and achievements, finding that they improved intrinsic motivation, that some users felt a high level of intuition with the VR environment and found it easy to use, and that time pressure was disruptive for users [23].
For evaluating the quality of games, one of the most well-known surveys is the GEQ, which includes three modules: GEQ-IG, GEQ-PG, and social presence [24]. This survey is commonly used for VR experiences, as demonstrated in the study by Boletsis, where locomotion in VR was measured using a combination of the GEQ and the System Usability Scale (SUS) [25]. However, this approach was not entirely successful, as the two questionnaires did not measure the user experience equally. Therefore, a merged questionnaire was proposed. Vlahovic also highlighted the need to combine the GEQ with other measures to include items specific to VR, such as cybersickness [26].

3. Materials and Methods

In this section, the design of the serious game will be described. Firstly, the educational objectives will be presented, accompanied by the theories and methodologies that were followed. Secondly, the design process of each part of the serious game will be explained in detail, elucidating how this design has been adapted to the educational framework. The PC and VR versions can be downloaded at https://xrailab.es/cases/uvr/. Finally, the assessment method will be explained, along with the characteristics of the sample that participated in the experiment.

3.1. Design of the Serious Game

The primary objective of the serious game UVR is to facilitate the learning of complex 3D concepts, such as unwrapping 3D models. To achieve this goal, UVR incorporates an additional graphical interface layer compared to the conventional 3D model unwrapping process, which is typically imperceptible to the user. To enhance the learning experience, UVR utilizes animations to illustrate the process of 3D model unwrapping. Figure 1 provides an example of one such animation. This process is represented in three phases. As depicted in Figure 1, the initial phase marks the commencement of unwrapping, followed by the intermediate stage, then culminating in the near completion of the process, where all faces are nearly overlapping.
UVR integrates various components to impart knowledge regarding the characteristics of different geometric unwrapping methods (cubic, cylindrical, and spherical), the unwrapping process utilizing seams, and general insights into 3D model unwrapping and UVs. The selection of these types is rooted in the fact that the 3D modeling software Blender incorporates them as default features. Given Blender’s open-source nature, it serves as an ideal complement to UVR for classroom instruction [27].
The distinguishing feature of UVR lies in its visualization of the 3D unwrapping process, aimed at harnessing the full potential of VR to enhance spatial concept learning [28]. Nevertheless, it is important that users devote close attention to the animations, as they facilitate a deeper understanding of the characteristics of each unwrapping type. To reinforce this, a game system based on the selection of 3D unwrapping has been implemented. Users can interact with various 3D models, each offering a choice among three types of unwrapping, with only one being the most accurate. The primary objective of this serious game is to enhance learning, and every 3D model must be correctly unwrapped to complete the game. In order of importance, the following criteria have been identified for a correct unwrapping (a simple illustration of how they could be combined follows the list):
  • The unwrapping must minimize distortion.
  • The number of islands into which it is divided should be relatively low, facilitating the identification of each.
  • The unwrapping should be as compact as possible, occupying the majority of the UV space available.
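The following toy sketch (hypothetical Python, not the authors’ implementation) illustrates how these three criteria could be combined, in order of importance, to rank candidate unwraps; the distortion, island, and coverage figures are placeholder values:

    from dataclasses import dataclass

    @dataclass
    class UnwrapCandidate:
        name: str
        distortion: float   # average texture stretch; lower is better
        islands: int        # number of UV islands; fewer are easier to identify
        coverage: float     # fraction of the available UV space occupied; higher is better

    def rank(candidates):
        # Lexicographic ordering by importance: distortion first, then island
        # count, then coverage (negated so that higher coverage ranks first).
        return sorted(candidates, key=lambda c: (c.distortion, c.islands, -c.coverage))

    options = [
        UnwrapCandidate("cubic", distortion=0.12, islands=6, coverage=0.55),
        UnwrapCandidate("cylindrical", distortion=0.08, islands=2, coverage=0.70),
        UnwrapCandidate("spherical", distortion=0.20, islands=1, coverage=0.65),
    ]
    print(rank(options)[0].name)  # best candidate under this toy ordering -> "cylindrical"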
To enhance the effectiveness of the serious game, several Gamification elements have been integrated. Recognizing that completing UVR requires the correct unwrapping of all 3D models, and given that only three options are available per model, users might be inclined to make arbitrary selections in order to expedite their progress through the game. To address this potential issue, a reward system has been implemented, drawing upon common Gamification principles often employed in educational contexts [29]. UVR employs a scoring system to evaluate user performance: the more errors made, the lower the score awarded. This scoring system operates on a performance-contingent basis, meaning that points are earned based on the user’s performance during gameplay [30]. Furthermore, the rewards provided are intangible, existing solely within the context of the serious game itself [31]. This system effectively discourages users from randomly selecting 3D unwrapping options, as doing so results in a deduction of points [22]. Moreover, it enables instructors to identify users who have made random selections, as these individuals are likely to exhibit a higher frequency of errors and a shorter duration of gameplay. Detailed explanations of the Gamification mechanics employed within UVR will be provided in Section 3.2.
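A minimal sketch of this performance-contingent rule (assumed logic using the +2/−1 values detailed in Section 3.2, not the actual game code) is:

    def update_score(score: int, correct: bool) -> int:
        # Correct unwraps add 2 points; errors subtract 1, i.e. half the reward,
        # so random guessing among the three options yields little or no net gain.
        return score + 2 if correct else score - 1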
To enhance the design of the experience, the principles of Mayer’s multimedia learning cognitive theory have been incorporated [17]. This theory advocates for 12 principles that are aimed at enhancing learning outcomes and reducing mental overload. Given the importance of these objectives in experiences integrating Gamification elements, the principles were adapted for use in UVR, which can be played on both 2D-desktop and VR platforms. All principles are described in the Cognitive Theory of Multimedia Learning by Mayer [17]. In total, 8 principles were integrated into UVR:
  • Segmenting principle: Educational content is divided into separate parts rather than presented as a unified whole.
  • Pre-training principle: Names and key features of the experience are provided prior to its commencement.
  • Coherence principle: Extraneous material that does not contribute to the experience is removed.
  • Signaling principle: Key elements are highlighted visually or aurally to enhance usability and experience.
  • Personalization principle: Learning is enhanced when the multimedia message is conveyed in a conversational style rather than formally.
  • Embodiment principle: This principle involves the implementation of human-like traits and elements, such as gestures, movements, or emotions.
  • Guided Discovery principle: Learning improves when a guide is incorporated during the experience.
  • Spatial Contiguity principle: Corresponding textual and graphical material are presented close to each other.
In Section 3.2, the implementation of the principles of the multimedia learning cognitive theory will be thoroughly elucidated.

3.2. Development of the Serious Game

With the application of the design theories previously elucidated to the educational objective of UVR—specifically, enhancing understanding of 3D unwrapping through graphical visualization—a preliminary version of UVR has been developed. UVR is situated within a classroom environment where users are tasked with selecting the 3D unwrapping method that best suits the provided models, all while receiving visual assistance in the unwrapping process. To elucidate the functioning of the serious game and its alignment with Mayer’s principles and Gamification techniques, all elements of the serious game will be detailed, followed by an explanation of the phases of UVR. This sequence is chosen because the serious game maintains consistency in its elements across levels, featuring only variations in difficulty and unwrapping type. Subsequently, differences between the PC and VR platforms for UVR will be delineated.
UVR features a singular scenario encompassing all its elements. Each element will be thoroughly explained, including the rationale behind its inclusion and functionality, based on the principles of the multimedia learning cognitive theory and Gamification. To facilitate comprehension, Figure 2 offers an overhead view of the classroom environment, with each element being identified by a corresponding number (note that each number corresponds to the description of the element).
  • Classroom (Figure 2(1)): This is the primary environment in which UVR unfolds, occupying a space of 4 × 4 square meters. The scenario is intentionally adorned with minimal elements, featuring only two external components: a door and a window. Given the confined movement area, these elements were incorporated to impart a greater sense of spaciousness. This design decision aligns with the Coherence principle, as it simplifies the experience by removing non-essential elements that do not contribute to the educational objective. Additionally, the size of the environment has been reduced to facilitate movement among the various playable elements. Furthermore, the colors within the classroom are more subdued compared to the rest of the playable elements. This choice adheres to the Coherence principle by ensuring that superfluous elements are not emphasized. Moreover, it complies with the Signaling principle, as the playable elements are distinguished by their vibrant colors, enhancing visibility and guiding user attention.
  • Object Table (Figure 2(2)): This table serves as the platform for displaying the 3D models available for unwrapping. Users have the freedom to move around the environment, including the table. At the start of each level, all objects are positioned on the table. The number of 3D models varies depending on the level, with specific objects being chosen for each level, as detailed in the level explanations.
  • Unwrapping Table (Figure 2(3)): This table houses all the elements through which the user can control the unwrapping of 3D models. Figure 3 provides an overhead view of these controllers, with each corresponding letter representing its description in the text. These elements will be described below.
    Figure 3. Overhead view of the unwrapping table showcasing the unwrapping controllers: (a) the Plate, (b) the Unwrapping Buttons, (c) the Lever, (d) the Punctuation display, (e) the Advancing Level Button, and (f) the Clue Button.
    • Plate (Figure 3a): The plate serves as a surface where users must place the object to initiate the unwrapping process. When the object is in close proximity to the plate, it will automatically be positioned on it if the user releases the object.
    • Unwrapping Buttons (Figure 3b): Three buttons are provided to select the unwrapping option when an object is placed on the plate. If the object is intended to be unwrapped using the three geometric projections (cube, cylinder, or sphere), the buttons will display an image representing each projection. However, if the object is intended to be unwrapped through seams, the images will be replaced with the numbers ‘1’, ‘2’, and ‘3’, each corresponding to a different version of the seam unwrapping. Figure 4 illustrates the two types of unwrapping buttons.
      Figure 4. Unwrapping buttons: (a) geometric shapes version, and (b) seams version.
    • Lever (Figure 3c): Once users have selected the preferred unwrapping option for the 3D model, they can activate the lever to confirm their decision and determine its accuracy.
    • Punctuation display (Figure 3d): Depending on the unwrapping option chosen by the user, a corresponding addition or subtraction of points will be displayed in front of the plate. If the unwrapping is correct, 2 points are added (displayed as ‘+2’ in green); otherwise, 1 point is subtracted (displayed as ‘−1’ in red), as depicted in Figure 5. This design choice aligns with Gamification principles by incorporating intangible rewards based on user performance. Providing this feedback visibly to the user helps discourage random selections. Additionally, the colors green and red were selected to denote reward and punishment, respectively. Lastly, the point values (+2 and −1) were chosen to deter random selections: since there are three options, an incorrect selection deducts half the points of a correct one, so users who select randomly accumulate penalties that largely cancel out the points earned from their occasional correct choices.
      Figure 5. Punctuation display: (a) incorrect option, and (b) correct option.
    • Advancing Level Button (Figure 3e): Once all objects in a level have been unwrapped, a red button will appear on the unwrapping table. This button allows users to progress to the next level. This design adheres to the Segmenting principle, enabling the division of the serious game into distinct units.
    • Clue Button (Figure 3f): When a user selects an incorrect unwrapping option and interacts with the lever, a green button will appear on the unwrapping table. This green button enables the user to request a clue. The provision of these clues aligns with the Guided Discovery principle, as providing guidance during gameplay enhances learning. Upon pressing this button, a robot will appear, delivering key information to assist the user.
  • Robot (Figure 2(4)): When users press the clue button, a robot will appear to provide clues. This robot also appears when all the 3D models are correctly unwrapped, indicating the availability of the red button to advance to the next level. The robot communicates through speech bubbles, and its appearance is accompanied by a distinctive sound to notify the user. An example of the robot interacting is displayed in Figure 6.
  • Tablet (Figure 2(5)): This device appears when the user selects one of the available unwrapping types. The tablet features a slider that, when moved to the right, plays the animation frame by frame and, when moved to the left, reverses it frame by frame. When the slider is released, the animation pauses on the current frame, allowing for detailed observation of the unwrapping process (a simple sketch of this slider-to-frame logic is given after this list). Additionally, the tablet can be moved around the scenario to provide a better viewing angle of the animation. It can also be placed at any point in the scenario for user convenience, as the device’s physics are not activated.
  • The Path of the Unwrapping (Figure 2(6)): Although the path itself is not a tangible element, its consideration is crucial due to its significance in visualizing the unwrapping process. To optimize visibility, the animation’s path follows a diagonal direction relative to the space’s center, where the user typically resides. This orientation allows the user to closely observe the process from various viewpoints.
  • Blackboard (Figure 2(7)): Positioned in front of the unwrapping table, this surface serves as the viewing area for the fully unwrapped 3D model, as it is where the unwrapping path concludes. The completed unwrapping is highlighted within a frame on the blackboard. To convey additional information effectively, the Spatial Contiguity principle has been considered. As such, the blackboard displays the directions of U and V (the coordinates of the 2D textures), combining text and animation seamlessly.
Figure 6. The robot communicates with the user.
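As referenced in the Tablet description above, the slider-to-frame behavior can be sketched as follows (assumed logic written in plain Python, not the Unreal Engine implementation used in UVR):

    def scrub(current_frame: int, slider_value: float, total_frames: int) -> int:
        """slider_value in [-1, 1]: right of centre advances one frame,
        left of centre rewinds one frame, and 0 (released) pauses."""
        if slider_value > 0:
            current_frame += 1
        elif slider_value < 0:
            current_frame -= 1
        return max(0, min(current_frame, total_frames - 1))  # clamp to the animation range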
In addition, UVR consists of multiple levels that adhere to the Segmenting principle, with each level featuring distinct content. Despite the varying content types, all levels occur within the same scenario, and the elements previously explained are shared across all levels. The structural overview of UVR can be observed in Figure 7.
The levels are described below, including their educational purposes, game elements, and the types of 3D models to be unwrapped.
  • Level 0—Tutorial: This level serves to introduce users to the game mechanics. Following the Pre-training principle, the experience is described before initiation. All design elements are maintained in this level. Users encounter four easily recognizable 3D models: a Rubik’s cube representing cubic projections, a spear representing cylindrical projections, an apple representing spherical projections, and an organic figure (the head of a monkey) representing seam projections.
  • Level 1—Geometric Projections: The objective of this level is to acquaint users with geometric projections. All elements remain unchanged, and users encounter five 3D models representing the three geometric projections: cubic, cylindrical, and spherical.
  • Level 2—Seam Projections: This level aims to familiarize users with seam projections. All elements are retained, featuring two objects to be unwrapped using seams. Each object presents three unwrapping options, with only one being correct. The other two options demonstrate the issues that can arise with these types of unwrapping.
  • Level 3—Final Test: The goal of this level is to assess the user’s knowledge by presenting a challenging test. All elements are retained, albeit with fewer clues, and users encounter seven 3D models to be unwrapped that are more difficult than those in previous levels, evaluating the extent of learning improvement. Four of these objects pertain to geometric projections, while the remaining three represent seam projections.
  • Level 4—Results: In this level, users are not required to perform any actions. Instead, they are instructed to observe the blackboard, where their performance for each level is displayed (including the number of correct answers and mistakes) along with a star rating. This level exemplifies one of the core Gamification principles [32].
Given that the primary learning objective is the visualization of the 3D model unwrapping process, UVR has been designed to accommodate two versions: PC and VR.
  • PC version: The PC experience offers a diminished sense of three-dimensionality due to its 2D visualization and wider field of view (FOV). Consequently, users may find it less necessary to navigate the scenario extensively. Control is facilitated through the keyboard and mouse.
  • VR version: In contrast, the VR version offers a heightened sense of three-dimensionality, leveraging head-mounted displays (HMDs) and providing a greater perception of scale with a wider FOV in recent generations [33]. Consequently, users are encouraged to explore the scenario more extensively to interact with objects. This version incorporates teleport locomotion, a common feature in VR experiences [34].
However, during UVR’s development, an unresolved bug hindered the visualization of the 3D model unwrapping animation at certain points. This issue was particularly pronounced in the VR version due to the increased movement requirements resulting from the augmented sense of scale.
UVR was developed using Unreal Engine 5.1 and Blender 3.6.0 LTS. These software platforms were selected for their cost-effectiveness and capacity to create photorealistic elements [35]. Additionally, the research group possessed prior experience in developing similar VR projects [4].

3.3. Design and Validation of the Game Experience Survey

In order to validate this serious game, both a pre-test before the experience and a usability test after the experience were conducted. This methodological approach yields robust results compared to solely administering a test [36].
The pre-test involved a brief questionnaire designed to assess users’ prior experience with VR and their level of familiarity with it. Due to the high importance of cybersickness in VR experiences, as other studies have shown [37], users were observed during the experience for symptoms. Once the game was over, they were asked whether or not they had experienced cybersickness. For the post-test, the chosen instruments were the GEQ-IG, GEQ-PG [38], and the PQ [39]. The GEQ is widely recognized as a standard tool for assessing video games and serious games [40], as it encompasses key modules with specific questions regarding users’ experiences during gameplay (the GEQ-IG) and afterward (the GEQ-PG). Notably, this questionnaire offers two versions, with the GEQ-IG version selected for its brevity to minimize the likelihood of random or less-accurate responses from users, who were required to complete three questionnaires. Additionally, this questionnaire includes a Social Presence Module, typically utilized in experiences involving user collaboration. However, since UVR is an individual serious game, social presence was deemed irrelevant [41]. Therefore, instead of the Social Presence Module, the PQ was employed to evaluate individual presence and technical aspects related to user interaction, such as controllers. This technical evaluation was deemed essential for usability assessment, as it was conducted across two devices, PC and VR, with significant findings being reported in other studies [4].
Regarding the elements evaluated and the number of questions, the GEQ-IG assesses the following competencies (with two questions measuring each one):
  • Competence: This pertains to the user’s sense of comfort with the experience controls, akin to Computer Self-Efficacy (CSE) [42].
  • Sensory and Imaginative Immersion: This measures the user’s interest in the serious game, often associated with the sense of immersion “in the game” [43].
  • Flow: This assesses the user’s level of concentration and engagement with the actions [44].
  • Tension: This gauges the user’s experience of tension and difficulty in interacting with the serious game.
  • Challenge: This evaluates the user’s perception of being challenged by the serious game [45].
  • Negative affect: This identifies negative emotional states experienced by the user.
  • Positive affect: This identifies positive emotional states experienced by the user.
The GEQ-PG evaluates the following competencies:
  • Positive affect: This component identifies positive emotional states experienced by the user [45]. Six questions are used to measure this competency.
  • Negative affect: This aspect identifies negative emotional states [45]. Six questions are employed to assess this competency.
  • Returning: This refers to the sense of “returning from a journey” once the experience has concluded. It was evaluated using three questions.
  • Tiredness: This evaluates the user’s feelings of tiredness upon completion of the experience. Two items were used to assess this aspect.
The PQ evaluates users’ subjective sense of ‘being’ in the virtual environment. It comprises the following competencies:
  • Control factors: This component assesses the extent of control the user has over the environment. Fourteen questions are employed to evaluate this aspect.
  • Sensory factors: This competency evaluates the quantity of sensory information perceived by the user. Ten questions are used to measure this aspect.
  • Distraction factors: This assesses the user’s susceptibility to distraction from external stimuli, taking into account technological aspects such as the type of HMDs. Seven questions are utilized to evaluate this competency.
  • Realism factors: These factors gauge the level of realism in the virtual environment, considering its consistency and continuity. Six questions are included to evaluate this aspect.
Additional information regarding these questionnaires can be found in [38,39].
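For illustration, the per-competency scores reported in Section 4 can be obtained by averaging the 0–4 item responses belonging to each competency, as in the sketch below (the item grouping shown is hypothetical and does not reproduce the official scoring key of the questionnaires):

    from statistics import mean

    # Hypothetical mapping of GEQ-IG competencies to item indices (two items each).
    GEQ_IG_ITEMS = {
        "Competence": [0, 1],
        "Sensory and Imaginative Immersion": [2, 3],
        "Flow": [4, 5],
        "Tension": [6, 7],
        "Challenge": [8, 9],
        "Negative affect": [10, 11],
        "Positive affect": [12, 13],
    }

    def competency_scores(responses, item_map=GEQ_IG_ITEMS):
        """responses: one participant's 0-4 answers, in questionnaire item order."""
        return {name: mean(responses[i] for i in idx) for name, idx in item_map.items()}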

3.4. Evaluation with Final Users

For the assessment of the experience, the sample consisted of 53 students enrolled in the second year of the Video Games Design University Degree at Burgos University, as well as students pursuing the 3D Animation, Video Games, and Interactive Environments Vocational Degree. All students had received some instruction in 3D modeling and/or texturing.
Prior to the experience, all students completed the pre-test. Based on the results, three groups were established according to previous VR experience (XP): Low-XP (n = 11), comprising students with no prior experience in VR; Mid-XP (n = 29), consisting of students with limited VR experience (they had participated in a previous study conducted by the research group); and High-XP (n = 13), comprising students who frequently play VR video games. Taking the students’ experience levels into account, the sample was divided into PC and VR groups, with 25 students participating in the PC group and 28 in the VR group. This ensured that both groups were balanced in terms of VR experience and contained the minimum number of participants required for the study [46], as illustrated in Figure 8.
This balance between the groups aimed to provide insight into the functionality of UVR, with secondary groups in each category being created to compare the experience across different versions and levels of VR experience. Given the limited number of students in each group, significant differences were not sought. However, a robust evaluation was conducted using hierarchical clustering, comparing the results of the study groups with the clusters obtained. Hierarchical clustering is a method for forming hierarchical groups of mutually exclusive subsets [47]. It belongs to unsupervised learning, a machine learning paradigm in which no classes are given and models attempt to learn patterns from data without explicit guidance; clustering is one of the most typical unsupervised learning problems, in which models group instances based on the values of their features [48].
This analysis led to the identification of three principal clusters, as detailed in Section 4.
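A minimal sketch of this analysis (assumed preprocessing and placeholder data, not the authors’ exact pipeline) using agglomerative hierarchical clustering is shown below; in practice, the questionnaire scores of the 53 participants would replace the random matrix:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # One row per participant, one column per questionnaire competency score (placeholder data).
    X = np.random.default_rng(0).uniform(0, 4, size=(53, 15))

    Z = linkage(X, method="ward")                     # build the hierarchy of merges
    labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree into 3 mutually exclusive clusters
    print(np.bincount(labels)[1:])                    # number of participants in each cluster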
Before commencing the experience, the functionality of the serious game was explained. The experimental setup consisted of three workstations equipped with Intel Core i7-10710U processors, 32 GB of RAM, and NVIDIA RTX 2080 graphics cards. The HMDs used were the Meta Quest 2 and Meta Quest Pro, along with their VR hand controllers. Upon completion of the experience, all users completed the questionnaires in the following order: GEQ-IG, GEQ-PG, and PQ.

4. Results

In this section, the results obtained from the three questionnaires will be explained. These results are divided according to the evaluation groups and the clusters (refer to Supplementary Materials for more information). Results on cybersickness have not been taken into account, as no user suffered from this issue.
Firstly, the division into three clusters will be explained. Secondly, the results of each group will be presented and also displayed in radar plots to aid understanding. The hierarchical clustering analysis provided the following three clusters: Cluster 1, composed of 12 users; Cluster 2, composed of 24 users; and Cluster 3, composed of 17 users.
For a better understanding of the clusters, the relationship between users and groups was considered based on the experience device (PC or VR) and their degree of previous experience in VR (Low-XP, Mid-XP, and High-XP). This distribution can be observed in Figure 9. The X-axis displays the distribution, by percentage, of clusters based on their visualization device (left) and VR experience (right). Each cluster is represented on the Y-axis. The genders of the users were also considered in this cluster analysis, but no significant differences were found. The total percentage of women in the sample was 26%, making up 25%, 21%, and 35% of each cluster, respectively.
Considering this distribution as a reference, Clusters 1 and 2 exhibit a balanced distribution concerning users’ previous experience; in both cases, the percentage of users with Mid-XP is 58%, while Low-XP and High-XP users each account for approximately 20%. These percentages are not unusual, as Mid-XP users represent 55% of the total sample, while Low-XP users account for 21% and High-XP users for 25%. Therefore, these two clusters cannot be characterized by users’ VR experience. However, they differ in terms of visualization devices. Cluster 1 predominantly consists of VR users (83%), while Cluster 2 comprises more PC users (67%). These percentages are significant, as PC users represent 47% of the total and VR users 53%, indicating a balanced distribution. Cluster 3, in contrast, shows a similar distribution between PC (41%) and VR (59%) users, with a somewhat higher proportion of VR users, as in the total sample, but a more equitable distribution in terms of previous VR experience, with fewer Mid-XP users (47%) and a higher percentage of Low-XP and High-XP users. Thus, Cluster 1 differs in that it consists mainly of VR users, Cluster 2 of PC users, and Cluster 3 of a mix of Low-XP and High-XP users with a balance in the display device. This analysis of cluster composition is of great importance for the study, as it allows the behavior of the groups formed in the study (based on users’ visualization device and experience) to be compared with the clusters, since each cluster is distinguished by its predominant visualization device or level of user experience.
Regarding the survey results, the data from the GEQ-IG are presented in Table 1, with scores ranging from 0 to 4. Figure 10 illustrates the mean scores of each group as radar plots, with an expanded scale from 0 to 3.5 to better discern differences between groups. Additionally, Figure 10b depicts the deviation percentages between the Low-XP and High-XP groups based on the 100% mean score of the Mid-XP group, facilitating a clear comparison between the groups with the least and most previous experience. The scale of Figure 10b ranges from 50% to 150%.
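The deviation percentages in Figure 10b (and the analogous plots for the other questionnaires) follow the convention sketched below, with the Mid-XP mean taken as 100%; the example values are the Challenge scores discussed in Section 5:

    def deviation_percent(group_mean: float, mid_xp_mean: float) -> float:
        # Express a group's mean score relative to the Mid-XP mean (100%).
        return 100.0 * group_mean / mid_xp_mean

    print(round(deviation_percent(2.14, 1.97), 1))  # Low-XP Challenge vs. Mid-XP -> 108.6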
The results of the GEQ-PG are detailed in Table 2, with scores ranging from 0 to 4. Radar plots in Figure 11 depict these scores on a scale from 0 to 3.5, with Figure 11b illustrating deviation percentages from the Mid-XP group (ranging from 50% to 150%).
Finally, the results of the PQ are presented in Table 3, with their visual representation being depicted in Figure 12.

5. Discussion

In this section, the discussion of the results of the GEQ-IG, the GEQ-PG, and the PQ will be presented first, followed by the main conclusions drawn from this analysis. Since the SD was similar across all the results (around 1), it was not taken into account as a factor in the analysis.
Table 1 and Figure 10 present the results of the GEQ-IG. The experience was equally challenging for both PC and VR groups, as indicated by similar Challenge ratings. However, the interaction differed between the groups, with the PC group reporting higher Competence (2.96 vs. 2.57) and lower Tension (0.54 vs. 1.13), suggesting that the experience was easier and more satisfying to use for this group. These higher values related to ease of use may be attributed to the bug in the serious game, which was less noticeable on the PC device. In contrast, the Immersion and Flow values were higher in the VR group (2.46 vs. 1.94 and 3.32 vs. 2.78), indicating a higher level of concentration among VR users. This aligns with common results found in comparisons between PC and VR experiences [49]. Both groups reported significantly high Positive Affect values (3.16 and 3.05) and low values of Negative Affect (0.36 and 0.63), suggesting that the experience was satisfying for both groups (albeit slightly better for the PC group), possibly due to the lesser impact of the bug.
Moreover, differences were observed among the previous VR experience groups. The Low-XP Group exhibited better results compared to the Mid-XP group in terms of positive competencies. Negative Affect and Tension were low in this group (Negative Affect: 0.32 vs. 0.57 and 0.50, and Tension: 0.50 vs. 0.97 and 0.88), while Challenge was higher (2.14 vs. 1.97 and 1.73), possibly indicating a novelty effect wherein some users experienced greater immersion and satisfaction [50]. This effect has been studied in other VR experiences, yielding similar results [51]. Similar to expert players, the High-XP Group reported higher satisfaction across all positive competencies (with Flow being highlighted at 2.50) and lower negative competencies. This group considered the experience less challenging, likely due to their expertise in interacting with such systems [4]. In the case of the clusters, they reflected the trends observed in the corresponding groups, amplifying existing differences. Notably, Cluster 1 (mainly composed of VR users) exhibited the highest values in Flow and Tension, Cluster 2 (mainly composed of PC users) showed higher Competence, and Cluster 3 (mainly composed of Low-XP and High-XP users) reported higher positive competencies and lower negative ones.
Moving on to the GEQ-PG results (Table 2 and Figure 11), PC users found the experience less immersive, with lower Returning rates (0.80 vs. 1.85) and Tiredness scores (0.48 vs. 0.96) compared to the VR group. However, their satisfaction was slightly higher, as indicated by the higher Positive Affect (2.40 vs. 1.89) and lower Negative Affect (0.52 vs. 0.76). Results of the GEQ-PG aligned with the GEQ-IG. The Low-XP group exhibited a higher Positive Affect but lower scores in other competencies, likely due to the novelty effect. Conversely, the High-XP group displayed typical expert player behavior, reporting lower Tiredness (0.54 vs. 0.68 and 0.84), similar Returning rates to Mid-XP, and greater satisfaction (though lower than the Low-XP group). However, the High-XP group reported a slightly higher Negative Affect, possibly due to comparisons with previous experiences, featuring a minor novelty effect [52]. In conclusion, it was observed that clusters exhibited the extreme behaviors of the predominantly formed groups. Cluster 1 showed good satisfaction but lower immersion. Cluster 2 exhibited higher scores in returning and Tiredness but a greater Negative Affect. Cluster 3 demonstrated high satisfaction and immersion due to the expertise of users and the novelty effect.
Lastly, PQ results are presented in Table 3 and Figure 12. While PC and VR groups showed similar results, differences emerged concerning the degree of VR experience. The Low-XP Group scored highest across all competencies (with the High-XP group outperforming only in Control and Sensory Factor). The High-XP group’s higher Control Factor score (2.47 vs. 2.40 and 2.29) can be attributed to their greater experience, enabling better control of the serious game. Also, Cluster 3 exhibited higher scores in all competencies, thereby accentuating the behavior of the Low- and High-XP groups.
Overall, the results demonstrated higher scores in positive competencies, such as Positive Affect, comparable to other experiences utilizing similar questionnaires [53,54,55]. Differences were observed among groups, with patterns indicating greater immersion in the VR group but lower satisfaction compared to the PC group, possibly due to the more noticeable bug in the VR experience. Additionally, the Low-XP and High-XP groups reported higher scores in positive competencies, possibly attributed to the novelty effect and extensive experience, respectively. These findings align with previous research demonstrating increased immersion and the novelty effect in VR experiences [28,56]. Lastly, differences were magnified within the cluster groups, highlighting distinct patterns based on the serious game device and users’ previous VR experience levels.

Limitations

As a limitation of this study, it should be noted that the bug was more noticeable in the VR group. This bug necessitated users to keep the animation’s origin within their field of view; otherwise, the entire animation would disappear. Due to the nature of VR, establishing an appropriate angle to maintain visibility of both the animation and its origin was more complex. Consequently, this heightened the visibility of the bug within this group.

6. Conclusions

In this research, the design and evaluation of Unwrap 3D Virtual: Ready (UVR) are elucidated, aiming to enhance learning about 3D model unwrapping. The principal innovation of this serious game lies in the incorporation of animations, enabling users to comprehend the unwrapping process and understand its functionality. The design of this experience adhered to Mayer’s Cognitive Theory of Multimedia Learning and Gamification principles, resulting in a structured experience comprising four levels. Each level focused on different aspects of 3D model unwrapping, progressively increasing in difficulty, with the final level being dedicated to result review. Within these levels, users were required to select from three unwrapping options, ensuring correct unwrapping of every 3D model in order to progress. Users were encouraged to pay attention to animations, and they could request clues by pressing the green button if needed. Level 0 served as an introduction to the control system and basic concepts of geometric and seam projections, while Levels 1 and 2 delved into the advanced concepts of geometric and seam projections, respectively. Level 3 presented a final challenge without clues, encompassing both geometric and seam projections. UVR concluded with Level 4, allowing users to review their results.
The validation of this serious game involved a sample of 53 students who were assessed using the GEQ-IG, the GEQ-PG, and the PQ. To ensure robust evaluation, the sample was categorized based on the device used (PC or VR) and previous experience in VR, resulting in comparisons between PC and VR and between Low-XP, Mid-XP, and High-XP groups. Hierarchical clustering yielded three clusters. The results indicate high satisfaction, with greater satisfaction being observed in the PC group due to a bug affecting VR users more severely. Despite this, VR users reported higher immersion. Furthermore, differences were observed between the Low-XP and High-XP groups, with both reporting higher satisfaction, attributed to the novelty effect and to greater experience, respectively. These findings were reinforced by the extreme behaviors exhibited by the clusters, reflecting the broader sample behavior.
Future research endeavors include testing the serious game in a learning context to assess its educational efficacy. This will entail further refinement of the design, although initial satisfaction results have been promising. Additionally, efforts will focus on improving the VR version to address the bug issue and enhance usability. This is essential for comparing the impact of the bug against a non-buggy experience, as it can yield interesting insights [57]. Lastly, a more extensive tutorial will be introduced at the outset to mitigate the novelty effect, potentially impacting user performance.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics13101972/s1, Table S1: Data extracted from the surveys in Excel file format.

Author Contributions

Conceptualization, B.R.-G. and J.M.R.-S.; methodology, B.R.-G.; software (Unreal Engine 5.1, and Blender 3.6), I.M.-A., B.R.-G. and J.M.R.-S.; validation, I.M.-A., B.R.-G. and J.M.R.-S.; formal analysis, B.R.-G.; investigation, B.R.-G. and A.B.; resources, I.M.-A.; data curation, B.R.-G.; writing—original draft preparation, B.R.-G.; writing—review and editing, J.M.R.-S. and I.M.-A.; visualization, A.B.; supervision, A.B.; project administration, A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Consejeria de Educacion of the Junta de Castilla y Leon and the European Social Fund through a pre-doctoral grant (EDU/875/2021), and by the Ministry of Science, Innovation and Universities (FPU21/01978).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of the Universidad de Burgos (protocol code IR-14/2022 and date of approval 1 December 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Guillen-Sanz, H.; Checa, D.; Miguel-Alonso, I.; Bustillo, A. A Systematic Review of Wearable Biosensor Usage in Immersive Virtual Reality Experiences. Virtual Real. 2024, 28, 74. [Google Scholar] [CrossRef]
  2. Marougkas, A.; Troussas, C.; Krouska, A.; Sgouropoulou, C. Virtual Reality in Education: A Review of Learning Theories, Approaches and Methodologies for the Last Decade. Electronics 2023, 12, 2832. [Google Scholar] [CrossRef]
  3. Lee, L.-K.; Wei, X.; Chui, K.T.; Cheung, S.K.S.; Wang, F.L.; Fung, Y.-C.; Lu, A.; Hui, Y.K.; Hao, T.; U, L.H.; et al. A Systematic Review of the Design of Serious Games for Innovative Learning: Augmented Reality, Virtual Reality, or Mixed Reality? Electronics 2024, 13, 890. [Google Scholar] [CrossRef]
  4. Miguel-Alonso, I.; Checa, D.; Guillen-Sanz, H.; Bustillo, A. Evaluation of the Novelty Effect in Immersive Virtual Reality Learning Experiences. Virtual Real. 2024, 28, 27. [Google Scholar] [CrossRef]
  5. Huang, H.; Lin, C.; Cai, D. Enhancing the Learning Effect of Virtual Reality 3D Modeling: A New Model of Learner’s Design Collaboration and a Comparison of Its Field System Usability. Univers. Access Inf. Soc. 2021, 20, 429–440. [Google Scholar] [CrossRef]
  6. Huang, H.; Lee, C.-F. Factors Affecting Usability of 3D Model Learning in a Virtual Reality Environment. Interact. Learn. Environ. 2022, 30, 848–861. [Google Scholar] [CrossRef]
  7. Abich, J.; Parker, J.; Murphy, J.S.; Eudy, M. A Review of the Evidence for Training Effectiveness with Virtual Reality Technology. Virtual Real. 2021, 25, 919–933. [Google Scholar] [CrossRef]
  8. Safadel, P.; White, D. Effectiveness of Computer-Generated Virtual Reality (VR) in Learning and Teaching Environments with Spatial Frameworks. Appl. Sci. 2020, 10, 5438. [Google Scholar] [CrossRef]
  9. Hsu, Y.-C. Exploring the Learning Motivation and Effectiveness of Applying Virtual Reality to High School Mathematics. Univers. J. Educ. Res. 2020, 8, 438–444. [Google Scholar] [CrossRef]
  10. İbili, E.; Çat, M.; Resnyansky, D.; Şahin, S.; Billinghurst, M. An Assessment of Geometry Teaching Supported with Augmented Reality Teaching Materials to Enhance Students’ 3D Geometry Thinking Skills. Int. J. Math. Educ. Sci. Technol. 2020, 51, 224–246. [Google Scholar] [CrossRef]
  11. Simonetti, M.; Perri, D.; Amato, N.; Gervasi, O. Teaching Math with the Help of Virtual Reality. In Proceedings of the International Conference on Computational Science and Its Applications, Cagliari, Italy, 1–4 July 2020; pp. 799–809. [Google Scholar]
  12. Chen, S.; Zhu, J.; Cheng, C.; Pan, Z.; Liu, L.; Du, J.; Shen, X.; Shen, Z.; Zhu, H.; Liu, J.; et al. Can Virtual Reality Improve Traditional Anatomy Education Programmes? A Mixed-Methods Study on the Use of a 3D Skull Model. BMC Med. Educ. 2020, 20, 395. [Google Scholar] [CrossRef]
  13. Plecher, D.A.; Keil, L.; Kost, G.; Fiederling, M.; Eichhorn, C.; Klinker, G. Exploring Underwater Archaeology Findings with a Diving Simulator in Virtual Reality. Front. Virtual Real. 2022, 3, 901335. [Google Scholar] [CrossRef]
  14. Benko, H.; Ishak, E.W.; Feiner, S. Collaborative Mixed Reality Visualization of an Archaeological Excavation. In Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 2–5 November 2004; pp. 132–140. [Google Scholar]
  15. Eichhorn, C.; Plecher, D.; Klinker, G. VR Enabling Knowledge Gain for the User (VENUS). In Proceedings of the 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, Singapore, 17–21 October 2022. [Google Scholar]
  16. Zengerle, T.; Plecher, D.A.; Flegr, S.; Kuhn, J.; Fischer, M.R. Teaching Optical Principles in XR. In INFORMATIK 2023—Designing Futures: Zukünfte Gestalten; Gesellschaft für Informatik e.V.: Bonn, Germany, 2023. [Google Scholar] [CrossRef]
  17. Mayer, R.E. Cognitive Theory of Multimedia Learning. In The Cambridge Handbook of Multimedia Learning; Mayer, R., Ed.; Cambridge University Press: Cambridge, UK, 2014; pp. 43–71. [Google Scholar]
  18. Miguel-Alonso, I.; Guillen-Sanz, H.; Rodriguez-Garcia, B.; Bustillo, A. Design and Development of a Gamified Tutorial for IVR Serious Games. Eur. Conf. Games Based Learn. 2023, 17, 411–417. [Google Scholar] [CrossRef]
  19. Pande, P.; Thit, A.; Sørensen, A.E.; Mojsoska, B.; Moeller, M.E.; Jepsen, P.M. Long-Term Effectiveness of Immersive VR Simulations in Undergraduate Science Learning: Lessons from a Media-Comparison Study. Res. Learn. Technol. 2021, 29, 2482. [Google Scholar] [CrossRef]
  20. Faust, A. The Effects of Gamification on Motivation and Performance; Springer Fachmedien Wiesbaden: Wiesbaden, Germany, 2021; ISBN 978-3-658-35194-6. [Google Scholar]
  21. Staneva, A.; Ivanova, T.; Rasheva-Yordanova, K.; Borissova, D. Gamification in Education: Building an Escape Room Using VR Technologies. In Proceedings of the 2023 46th MIPRO ICT and Electronics Convention (MIPRO), Opatija, Croatia, 22–26 May 2023; pp. 678–683. [Google Scholar]
  22. Park, J.; Kim, S.; Kim, A.; Yi, M.Y. Learning to Be Better at the Game: Performance vs. Completion Contingent Reward for Game-Based Learning. Comput. Educ. 2019, 139, 1–15. [Google Scholar] [CrossRef]
  23. Tiefenbacher, F. Evaluation of Gamification Elements in a VR Application for Higher Education. In Proceedings of the Systems, Software and Services Process Improvement: 27th European Conference, EuroSPI 2020, Düsseldorf, Germany, 9–11 September 2020; pp. 830–847. [Google Scholar]
  24. Tao, G.; Garrett, B.; Taverner, T.; Cordingley, E.; Sun, C. Immersive Virtual Reality Health Games: A Narrative Review of Game Design. J. Neuroeng. Rehabil. 2021, 18, 31. [Google Scholar] [CrossRef]
  25. Boletsis, C. A User Experience Questionnaire for VR Locomotion: Formulation and Preliminary Evaluation. In Proceedings of the International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Lecce, Italy, 7–10 September 2020; pp. 157–167. [Google Scholar]
  26. Vlahovic, S.; Suznjevic, M.; Skorin-Kapov, L. A Survey of Challenges and Methods for Quality of Experience Assessment of Interactive VR Applications. J. Multimodal User Interfaces 2022, 16, 257–291. [Google Scholar] [CrossRef]
  27. Cai, L.; Yang, G. Development and Practice of Virtual Experiment Platform Based on Blender and Html5-Taking Computer Assembly and Maintenance as an Example. J. Phys. Conf. Ser. 2020, 1601, 032034. [Google Scholar] [CrossRef]
  28. Checa, D.; Bustillo, A. Advantages and Limits of Virtual Reality in Learning Processes: Briviesca in the Fifteenth Century. Virtual Real. 2020, 24, 151–161. [Google Scholar] [CrossRef]
  29. Kim, J.; Castelli, D.M. Effects of Gamification on Behavioral Change in Education: A Meta-Analysis. Int. J. Environ. Res. Public Health 2021, 18, 3550. [Google Scholar] [CrossRef]
  30. Neupane, A.; Hansen, D.; Fails, J.A.; Sharma, A. The Role of Steps and Game Elements in Gamified Fitness Tracker Apps: A Systematic Review. Multimodal Technol. Interact. 2021, 5, 5. [Google Scholar] [CrossRef]
  31. Xiao, Y.; Hew, K.F.T. Intangible Rewards versus Tangible Rewards in Gamified Online Learning: Which Promotes Student Intrinsic Motivation, Behavioural Engagement, Cognitive Engagement and Learning Performance? Br. J. Educ. Technol. 2024, 55, 297–317. [Google Scholar] [CrossRef]
  32. Laine, T.H.; Lindberg, R.S.N. Designing Engaging Games for Education: A Systematic Literature Review on Game Motivators and Design Principles. IEEE Trans. Learn. Technol. 2020, 13, 804–821. [Google Scholar] [CrossRef]
  33. Anthes, C.; García-Hernández, R.J.; Wiedemann, M.; Kranzlmüller, D. State of the Art of Virtual Reality Technology. In Proceedings of the IEEE Aerospace Conference Proceedings, Big Sky, MT, USA, 5–12 March 2016; IEEE Computer Society: Washington, DC, USA, 2016. [Google Scholar]
  34. Prithul, A.; Adhanom, I.B.; Folmer, E. Teleportation in Virtual Reality; A Mini-Review. Front. Virtual Real. 2021, 2, 730792. [Google Scholar] [CrossRef]
  35. Barszcz, M.; Dziedzic, K.; Skublewska-Paszkowska, M.; Powroznik, P. 3D Scanning Digital Models for Virtual Museums. Comput. Animat. Virtual Worlds 2023, 34, e2154. [Google Scholar] [CrossRef]
  36. Rodriguez-Garcia, B.; Guillen-Sanz, H.; Checa, D.; Bustillo, A. A Systematic Review of Virtual 3D Reconstructions of Cultural Heritage in Immersive Virtual Reality. Multimed. Tools Appl. 2024. [Google Scholar] [CrossRef]
  37. Avola, D.; Cinque, L.; Foresti, G.L.; Marini, M.R. A Novel Low Cybersickness Dynamic Rotation Gain Enhancer Based on Spatial Position and Orientation in Virtual Environments. Virtual Real. 2023, 27, 3191–3209. [Google Scholar] [CrossRef]
  38. Law, E.L.-C.; Brühlmann, F.; Mekler, E.D. Systematic Review and Validation of the Game Experience Questionnaire (GEQ)—Implications for Citation and Reporting Practice. In Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play, Melbourne, Australia, 28–31 October 2018; ACM: New York, NY, USA, 2018; pp. 257–270. [Google Scholar]
  39. Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence Teleoperators Virtual Environ. 1998, 7, 225–240. [Google Scholar] [CrossRef]
  40. Rebhi, M.; Ben Aissa, M.; Tannoubi, A.; Saidane, M.; Guelmami, N.; Puce, L.; Chen, W.; Chalghaf, N.; Azaiez, F.; Zghibi, M.; et al. Reliability and Validity of the Arabic Version of the Game Experience Questionnaire: Pilot Questionnaire Study. JMIR Form. Res. 2023, 7, e42584. [Google Scholar] [CrossRef]
  41. Caldas, O.I.; Mauledoux, M.; Aviles, O.F.; Rodriguez-Guerrero, C. Breaking Presence in Immersive Virtual Reality toward Behavioral and Emotional Engagement. Comput. Methods Programs Biomed. 2024, 248, 108124. [Google Scholar] [CrossRef]
  42. Tcha-Tokey, K.; Christmann, O.; Loup-Escande, E.; Richir, S. Proposition and Validation of a Questionnaire to Measure the User Experience in Immersive Virtual Environments. Int. J. Virtual Real. 2016, 16, 33–48. [Google Scholar] [CrossRef]
  43. Slater, M. Place Illusion and Plausibility Can Lead to Realistic Behaviour in Immersive Virtual Environments. Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 3549–3557. [Google Scholar] [CrossRef] [PubMed]
  44. Csikszentmihalyi, M. The Systems Model of Creativity; Springer: Dordrecht, The Netherlands, 2014; ISBN 978-94-017-9084-0. [Google Scholar]
  45. Högberg, J.; Hamari, J.; Wästlund, E. Gameful Experience Questionnaire (GAMEFULQUEST): An Instrument for Measuring the Perceived Gamefulness of System Use. User Model. User-Adapt. Interact. 2019, 29, 619–660. [Google Scholar] [CrossRef]
  46. Checa, D.; Bustillo, A. A Review of Immersive Virtual Reality Serious Games to Enhance Learning and Training. Multimed. Tools Appl. 2020, 79, 5501–5527. [Google Scholar] [CrossRef]
  47. Hennig, C.; Meila, M.; Murtagh, F.; Rocci, R. Handbook of Cluster Analysis; CRC Press: New York, NY, USA, 2015; ISBN 9780429185472. [Google Scholar]
  48. Murtagh, F.; Contreras, P. Algorithms for Hierarchical Clustering: An Overview. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2012, 2, 86–97. [Google Scholar] [CrossRef]
  49. Makransky, G.; Terkildsen, T.S.; Mayer, R.E. Adding Immersive Virtual Reality to a Science Lab Simulation Causes More Presence but Less Learning. Learn. Instr. 2019, 60, 225–236. [Google Scholar] [CrossRef]
  50. Sung, B.; Mergelsberg, E.; Teah, M.; D’Silva, B.; Phau, I. The Effectiveness of a Marketing Virtual Reality Learning Simulation: A Quantitative Survey with Psychophysiological Measures. Br. J. Educ. Technol. 2021, 52, 196–213. [Google Scholar] [CrossRef]
  51. Jiang, J.; Fryer, L.K. The Effect of Virtual Reality Learning on Students’ Motivation: A Scoping Review. J. Comput. Assist. Learn. 2024, 40, 360–373. [Google Scholar] [CrossRef]
  52. Zhao, J.; Wallgrün, J.O.; Sajjadi, P.; LaFemina, P.; Lim, K.Y.T.; Springer, J.P.; Klippel, A. Longitudinal Effects in the Effectiveness of Educational Virtual Field Trips. J. Educ. Comput. Res. 2022, 60, 1008–1034. [Google Scholar] [CrossRef]
  53. Pousada García, T.; Pereira Loureiro, J.; Groba, B.; Nieto-Riveiro, L.; Martín, J.; Lagos Rodríguez, M. User Experience in Virtual Reality from People With and Without Disability. In HCI International 2023 Posters. HCII 2023. Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2023; Volume 1883, pp. 352–357. [Google Scholar]
  54. Rodríguez-Fuentes, G.; Campo-Prieto, P.; Souto, C.; María, J.; Carral, C. Immersive Virtual Reality and Its Influence on Physiological Parameters in Healthy People. Retos 2024, 51, 615–623. [Google Scholar] [CrossRef]
  55. Lochner, D.C.; Gain, J.E. VR Natural Walking in Impossible Spaces. In Proceedings of the 14th ACM SIGGRAPH Conference on Motion, Interaction and Games, Virtual Conference, 10–12 November 2021; ACM: New York, NY, USA, 2021; pp. 1–9. [Google Scholar]
  56. Rodriguez-Garcia, B.; Alaguero, M.; Guillen-Sanz, H.; Miguel-Alonso, I. Comparing the Impact of Low-Cost 360° Cultural Heritage Videos Displayed in 2D Screens Versus Virtual Reality Headsets; Springer: Cham, Switzerland, 2022; Volume 13446, ISBN 9783031155529. [Google Scholar]
  57. Politowski, C.; Petrillo, F.; Gueheneuc, Y.-G. A Survey of Video Game Testing. In Proceedings of the 2021 IEEE/ACM International Conference on Automation of Software Test (AST), Madrid, Spain, 20–21 May 2021; pp. 90–99. [Google Scholar]
Figure 1. Animation of a 3D model unwrapping from a cube: (a) beginning of the unwrapping; (b) advanced stage of the unwrapping; and (c) the faces almost overlapping as the unwrapping is about to finish.
Figure 2. Overhead view of the UVR classroom setup: (1) Classroom, (2) Object Table, (3) Unwrapping Table, (4) the Robot, (5) Tablet, (6) the Path of the Unwrapping, and (7) the Blackboard.
Figure 7. UVR structure overview: levels, difficulty, number of model types to unwrap, and availability of the clue system.
Figure 8. Users during the experience: (a) PC group and (b) VR group.
Figure 9. Composition of the hierarchical clusters: the Y-axis shows each cluster (1, 2, 3) and the X-axis shows its percentage composition. The left side breaks each cluster down by device group (PC and VR), while the right side breaks it down by previous VR experience (Low-XP, Mid-XP, and High-XP).
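To make the cluster composition shown in Figure 9 easier to relate to the underlying data, the following Python snippet sketches how participants' questionnaire scores could be grouped with Ward hierarchical clustering and how each cluster's composition by device group could then be tabulated. It is an illustrative sketch only, not the analysis code used in the study; the file name and column names (e.g., device and the component score columns) are assumptions made for the example.

```python
# Illustrative sketch only: NOT the authors' analysis pipeline.
# Groups respondents with Ward hierarchical clustering and reports
# each cluster's percentage composition by device group, analogous
# to the left-hand side of Figure 9. File and column names are assumed.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.preprocessing import StandardScaler

# Each row is one participant; component scores are on a 0-4 scale.
df = pd.read_csv("uvr_survey_scores.csv")  # hypothetical file
score_cols = ["competence", "immersion", "flow", "tension",
              "challenge", "negative_affect", "positive_affect"]

# Standardize scores so no single component dominates the distances.
X = StandardScaler().fit_transform(df[score_cols])

# Build the hierarchy with Ward linkage and cut it into three clusters,
# mirroring the three clusters reported in the article.
Z = linkage(X, method="ward")
df["cluster"] = fcluster(Z, t=3, criterion="maxclust")

# Percentage composition of each cluster by device group (PC vs. VR).
composition = (df.groupby("cluster")["device"]
                 .value_counts(normalize=True)
                 .mul(100).round(1))
print(composition)
```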
Figure 10. Distribution of mean GEQ-IG values for the groups: (a) PC and VR; (b) previous VR experience (deviation percentage relative to the Mid-XP group); and (c) Clusters 1, 2, and 3.
Figure 11. Distribution of mean GEQ-PG values for the groups: (a) PC and VR; (b) previous VR experience (deviation percentage relative to the Mid-XP group); and (c) Clusters 1, 2, and 3.
Figure 12. Distribution of mean PQ values for the groups: (a) PC and VR; (b) previous VR experience (deviation percentage relative to the Mid-XP group); and (c) Clusters 1, 2, and 3.
Table 1. Results of the GEQ-IG: mean (M, scale 0 to 4) and standard deviation (SD) of each component for the device groups (PC and VR), previous VR experience groups (Low-XP, Mid-XP, and High-XP), and Clusters 1, 2, and 3.
| Component | PC | VR | Low-XP | Mid-XP | High-XP | Cluster 1 | Cluster 2 | Cluster 3 |
|---|---|---|---|---|---|---|---|---|
| Total N | 25 | 28 | 11 | 29 | 13 | 12 | 24 | 17 |
| M Competence | 2.96 | 2.57 | 3.09 | 2.57 | 2.88 | 1.83 | 2.69 | 3.50 |
| SD Competence | 0.81 | 1.04 | 0.92 | 0.86 | 1.11 | 0.64 | 0.78 | 0.75 |
| M Immersion | 2.78 | 3.32 | 3.27 | 2.98 | 3.08 | 2.92 | 2.77 | 3.59 |
| SD Immersion | 0.97 | 0.96 | 1.12 | 0.95 | 1.02 | 1.02 | 0.86 | 0.99 |
| M Flow | 1.94 | 2.46 | 2.09 | 2.14 | 2.50 | 2.42 | 1.83 | 2.62 |
| SD Flow | 1.02 | 1.06 | 1.06 | 1.12 | 0.95 | 0.88 | 1.04 | 1.07 |
| M Tension | 0.54 | 1.13 | 0.50 | 0.97 | 0.88 | 1.75 | 0.33 | 0.94 |
| SD Tension | 0.97 | 1.24 | 0.80 | 1.20 | 1.28 | 0.99 | 0.72 | 1.35 |
| M Challenge | 1.88 | 2.00 | 2.14 | 1.97 | 1.73 | 1.96 | 1.90 | 2.00 |
| SD Challenge | 1.22 | 1.16 | 1.17 | 1.21 | 1.15 | 0.95 | 1.21 | 1.33 |
| M Negative Affect | 0.36 | 0.63 | 0.32 | 0.57 | 0.50 | 1.21 | 0.25 | 0.35 |
| SD Negative Affect | 0.56 | 0.91 | 0.72 | 0.75 | 0.86 | 0.93 | 0.44 | 0.73 |
| M Positive Affect | 3.16 | 3.05 | 3.32 | 2.98 | 3.19 | 2.42 | 3.04 | 3.68 |
| SD Positive Affect | 0.79 | 0.90 | 0.72 | 0.91 | 0.80 | 0.93 | 0.71 | 0.53 |
Table 2. Results of the GEQ-PG: mean (M, scale 0 to 4) and standard deviation (SD) of each component for the device groups (PC and VR), previous VR experience groups (Low-XP, Mid-XP, and High-XP), and Clusters 1, 2, and 3.
| Component | PC | VR | Low-XP | Mid-XP | High-XP | Cluster 1 | Cluster 2 | Cluster 3 |
|---|---|---|---|---|---|---|---|---|
| Total N | 25 | 28 | 11 | 29 | 13 | 12 | 24 | 17 |
| M Negative Affect | 0.52 | 0.76 | 0.54 | 0.70 | 0.65 | 1.39 | 0.49 | 0.33 |
| SD Negative Affect | 0.86 | 1.08 | 0.87 | 0.98 | 1.13 | 1.12 | 0.81 | 0.84 |
| M Positive Affect | 2.40 | 1.89 | 2.48 | 2.16 | 2.47 | 1.92 | 1.72 | 2.90 |
| SD Positive Affect | 1.06 | 1.23 | 1.07 | 1.01 | 1.13 | 0.85 | 1.16 | 1.03 |
| M Returning | 0.80 | 1.85 | 1.12 | 1.46 | 1.31 | 2.39 | 0.50 | 1.65 |
| SD Returning | 1.01 | 1.38 | 1.36 | 1.30 | 1.36 | 1.10 | 0.80 | 1.43 |
| M Tiredness | 0.48 | 0.96 | 0.68 | 0.84 | 0.54 | 1.71 | 0.40 | 0.53 |
| SD Tiredness | 0.76 | 1.17 | 1.04 | 1.09 | 0.86 | 1.08 | 0.71 | 0.96 |
Table 3. Results of the PQ: mean (M, scale 0 to 4) and standard deviation (SD) of each component for the device groups (PC and VR), previous VR experience groups (Low-XP, Mid-XP, and High-XP), and Clusters 1, 2, and 3.
| Component | PC | VR | Low-XP | Mid-XP | High-XP | Cluster 1 | Cluster 2 | Cluster 3 |
|---|---|---|---|---|---|---|---|---|
| Total N | 25 | 28 | 11 | 29 | 13 | 12 | 24 | 17 |
| M Control Factors | 2.33 | 2.38 | 2.40 | 2.29 | 2.47 | 2.14 | 2.17 | 2.77 |
| SD Control Factors | 1.30 | 1.10 | 1.25 | 1.16 | 1.24 | 0.94 | 1.23 | 1.21 |
| M Sensory Factors | 2.09 | 2.18 | 2.40 | 2.01 | 2.20 | 1.85 | 1.92 | 2.65 |
| SD Sensory Factors | 1.41 | 1.24 | 1.20 | 1.32 | 1.39 | 1.10 | 1.31 | 1.33 |
| M Distraction Factors | 1.97 | 2.13 | 2.19 | 2.03 | 2.00 | 2.06 | 1.85 | 2.34 |
| SD Distraction Factors | 1.51 | 1.21 | 1.39 | 1.29 | 1.49 | 1.12 | 1.39 | 1.42 |
| M Realism Factors | 1.83 | 1.95 | 2.23 | 1.81 | 1.79 | 2.01 | 1.50 | 2.36 |
| SD Realism Factors | 1.50 | 1.20 | 1.24 | 1.33 | 1.45 | 1.13 | 1.31 | 1.40 |
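As a complementary illustration of how the descriptive statistics in Tables 1–3 are obtained, the short Python sketch below computes the per-group mean (M) and standard deviation (SD) of the component scores on the 0–4 scale. This is a minimal sketch under assumed column names and data layout, not the authors' actual pipeline.

```python
# Illustrative sketch only, not the authors' pipeline: per-group M and SD
# of questionnaire component scores, as reported in Tables 1-3.
# The file name and column names are assumptions for the example.
import pandas as pd

df = pd.read_csv("uvr_survey_scores.csv")  # hypothetical file
components = ["competence", "immersion", "flow", "tension",
              "challenge", "negative_affect", "positive_affect"]

# Group by display device (PC / VR); the same call with "vr_experience"
# or "cluster" as the key reproduces the remaining table columns.
summary = df.groupby("device")[components].agg(["mean", "std"]).round(2)
print(summary)
```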
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
