Article

The Influence of Immersive and Collaborative Virtual Environments in Improving Spatial Skills

Julián Conesa, Francisco José Mula, Kristin A. Bartlett, Ferran Naya and Manuel Contero
1 Graphical Expression Department, Universidad Politécnica de Cartagena, 40506 Cartagena, Spain
2 Department of Product Design, University of Kentucky, Lexington, KY 40506, USA
3 Instituto Universitario de Investigación en Tecnología Centrada en el Ser Humano (HUMAN-tech), Campus de Vera, Universitat Politècnica de Valencia, 46022 Valencia, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(14), 8426; https://doi.org/10.3390/app13148426
Submission received: 18 June 2023 / Revised: 9 July 2023 / Accepted: 14 July 2023 / Published: 21 July 2023
(This article belongs to the Special Issue Applications of Virtual, Augmented, and Mixed Reality - 2nd Volume)

Abstract

The use of Virtual Reality (VR) is growing in popularity in educational contexts. In this work, we present a novel software application based on immersive VR in which users can interact simultaneously with a model in a shared virtual scene and maintain audiovisual communication. The 3D model-building activity within the application was designed to improve spatial skills. Twenty-nine industrial engineering students completed the modeling activity in groups of two or three. Post-test scores showed significant improvement over pre-test scores for both spatial tests used to measure the effectiveness of the instrument in improving spatial skills. Participants rated the application favorably in terms of usability and functionality. Results suggest that spatial training in a collaborative immersive virtual environment has the potential to be an effective approach to improving spatial skills.

1. Introduction

Virtual Reality (VR) has grown in popularity as a learning tool over the past decade, including among academics and educators [1]. The improved user experience in terms of interaction, immersion, and quality of information [2], together with the decreasing cost of the technology, has propelled this expansion. Many researchers have highlighted the advantages of using VR for education, including learner satisfaction [3], improved motivation and problem-solving abilities [4,5], and the benefits of sensory-based learning [6,7,8]. Studies using VR-based platforms for knowledge acquisition are now found across a wide range of disciplines, including medicine [9], rehabilitation [10], industrial training [11], evacuation plans [12], urban planning [13], journalism [14], sports [15], geometry [16], design [17], biology [18], physics [19], and engineering [20].
In the past decade, many researchers have compared students’ learning in VR with learning through other mediums. Some examples include the work of Nadan et al. [21], Martin-Dorta et al. [22], Slavova and Mu [23], and Zakaria et al. [24]. One area of education that is particularly interesting for VR applications is collaborative learning. Collaborative learning is a pedagogical method in which individuals interact in order to solve problems together [25]. Gokhale [26] is considered to have introduced the concept of collaborative learning in university-level teaching and outlined the advantages of collaborative learning over individual learning.
Virtual collaborative environments are systems that connect multiple users who interact in real time in a virtual space via an internet connection and can serve as a platform for collaborative learning. Virtual collaborative environments that allow specialists from different disciplines to solve complex problems together are promising venues for research and for the improvement of work and productivity [27]. Activities taking place in virtual collaborative environments may be valuable for enhancing STEM education. Saorín et al. [28] concluded that it is possible to meet future engineering industry demands through training programs involving collaborative 3D modeling environments. Furthermore, according to Dominguez et al. [29], VR can be a powerful tool for training spatial abilities. While virtual collaborative environments are promising for STEM education, applications utilizing this technology for spatial skills training have not yet been developed. Spatial skills training is a popular area of study in STEM education because researchers believe that improving spatial skills can help students to succeed in STEM [30]. Thus, it would be valuable to explore the application of collaborative virtual environments to spatial training.
In the present work, we add to the research on virtual collaborative environments for educational settings by describing the design and validation of an immersive VR application used for a collaborative exercise for the construction of 3D models. We tested the application with 29 students with the goal of finding out whether the students’ spatial skills were improved through using the application, and whether the students found the application to have good usability and functionality. The main contribution of this work is to present evidence that performing a group modeling exercise in a virtual collaborative environment can improve performance on spatial tests. This document is structured such that, following a review of the most relevant literature, we describe the materials and methods of this study, present the results, and finally discuss the results and provide our conclusions.

2. Literature Review

2.1. Spatial Skills and STEM Education

Spatial ability is generally considered to be the ability to visualize and mentally manipulate relationships between objects and space. Spatial tests were originally used in the study of psychology to predict which individuals would be successful in mechanical vocations [31]. Today, spatial testing and spatial training remain very popular as many researchers believe that improving spatial skills can help students to succeed in STEM [30]. A meta-analysis of training studies concluded that including spatial training in education could lead to increased participation in STEM [32]. While spatial training often leads to improved performance on spatial tests, the evidence that spatial training enhances long-term STEM achievement has yet to be established. A review concluded that spatial training alone is unlikely to raise STEM scores substantially, and rigorous studies are still needed in order to establish a causal effect between spatial training and STEM outcomes [33].
The most common way to assess spatial ability is by using psychometric tests, which are considered to measure “fundamental” spatial skills [34,35]. Two spatial tests are used in this study and are discussed in more detail in Section 3.2 of this paper. Existing spatial tests have many limitations. Multiple researchers have pointed out that spatial skill is not a singular skill [31,36]; thus, no single test can be considered to measure spatial skill on its own. Spatial ability is instead an umbrella term covering many related but distinct skills, and different tests may assess different specific spatial skills.
Spatial tests typically portray 3D shapes as 2D black and white line drawings, and the way some of the shapes are drawn may lead to confusion for test takers and introduce additional difficulty into spatial tests that is not related to the 3D mental manipulation of shapes [37]. For this reason, Bartlett and Camba advocated for using more advanced visualization technologies to present spatial problems [38]. This recommendation is relevant not only for spatial testing, but also for spatial training applications such as the one described in this paper.

2.2. VR and Spatial Skills Training

Virtual reality is an increasingly popular medium for research on training spatial ability [39]. In addition to the visual problem of presenting a 3D task using 2D media, spatial tasks may face issues of engagement and motivation. Using media like VR to present spatial training tasks may help with student motivation, as traditional paper-and-pencil tasks can be unappealing or unmotivating to students [40].
In recent years, numerous researchers have been exploring new media like VR and augmented reality (AR) for spatial training tasks. One approach is to train mental rotation skills by having users rotate shapes in a virtual environment [41,42,43]. In another example, visual cues were added to the virtual environment with the goal of training mental rotation skills [44]. In addition to training mental rotation skills specifically, others have attempted to train spatial skills by having students view their own 3D models in VR [45], training the spatial skill of perspective-taking [46], and training general spatial ability [47]. Rather than immersive VR, other researchers have explored using AR to train spatial ability [48,49].
Virtual Reality-based spatial training programs have been demonstrated to be effective, as shown in the following examples. In one study, researchers used exercises in an AR book in an attempt to train spatial relations and spatial visualization skills and used a VR orienteering game to train the skill of spatial orientation [40]. This study used a control group who did not complete the training and found that the gains in performance on spatial tests at the end of the study were greater for the experimental group [40]. In another study, researchers developed a novel VR spatial training application in which students manipulated shapes in a virtual environment by moving, rotating, and scaling the shapes. To validate the tool, the researchers conducted an experiment in which one group of students used a VR headset to participate in the activity, and another group performed the same activity on a computer screen. Spatial test scores improved more for the virtual reality group than for the computer screen group [50]. These studies suggest that VR spatial training is effective, and potentially more so than computer-based spatial training media.
Following the study published by Bekele and Champion [51], it is important to differentiate between VR observed on a screen and immersive VR [52], in which the user is completely visually immersed in virtual space. According to Loeffler and Anderson [53], immersive VR allows the user to interact in a 3D computer-generated environment, and immersive VR is what has drawn the most interest from educators and educational researchers [54,55]. Oh and Nussli [56] argued that the potential of this technology for the classroom seems limitless. The works presented by Witmer et al. [57], Bliss et al. [58], and Koh et al. [59] conclude that immersive virtual environments are as effective as real-world environments for training spatial ability. Levels of immersion may be especially relevant to training spatial skills, as one study found that spatial learning tasks of different complexity may be better learned at different levels of immersion [60].
Other researchers have assessed VR-based spatial training instruments on the basis of usability and enjoyability. One study found that the right combination of levels of immersion and controls could lead to the most positive experience for the user in a VR-based mental rotation training task [43]. Another study assessed a VR-based training application for mental rotation. In this case, the researchers compared conditions in which the users could actively manipulate the objects versus passively observing the motion. Users rated the active condition higher for playfulness, ease of use, and use intention [39]. Thus, the design of spatial training applications is important not only for user comfort and enjoyment, but potentially also for motivation and training effectiveness.

2.3. Multi-User Collaborative Applications

In our review of the literature, the earliest platform we found for the development of multi-user VR applications was DIVE, described by Carlsson and Hagsand [61]. The DIVE platform permits a large number of users to participate and interact in the same virtual environment. Since then, the development of scalable architectures that can be adapted to virtual environments has remained a great challenge in spite of the advances in computing [62]. However, such applications are now being developed. For example, Rai et al. [63] presented a prototype of a networked virtual environment that centered on the combination of a Minority Game [64] and an Online Induction Algorithm with lossless data transfer functions. Furthermore, many popular games on consumer VR headsets, such as the Meta Quest 2, allow users to play together remotely over the internet with their avatars inhabiting the same virtual space and the capability of audio communication. Two examples of such games include “Rec Room” and “Walkabout Mini Golf.”
Beyond entertainment, virtual collaborative environments have the potential to enhance skills training and productivity [27]. User experience is also highly important in virtual collaborative environments [65]. Although other researchers have created spatial skills training applications using immersive VR, as described in Section 2.2, we were not able to find any examples of such applications which leveraged a multi-user collaborative environment for this purpose. Thus, we consider our application to be novel in the sense that it leverages a virtual collaborative environment for a spatial skills training activity.

3. Materials and Methods

Our investigation centers on an experimental analysis that compares students’ scores on spatial tests before and after performing an activity in which they construct geometric solids in a virtual immersive collaborative environment. We worked with an initial group of 41 student volunteers from a first-year industrial engineering course. As these students had completed technical coursework in high school, they were expected to have acquired some skills in spatial visualization through their past training in technical drawing.
The students’ baseline spatial skills were measured with two well-known tests, which were administered online on the students’ own time. Twelve students were removed from the study at this point since they did not complete the tests. The twenty-nine remaining students were divided randomly into nine groups of three students, plus one group of two students, for the completion of the collaborative virtual activity. Twelve of the twenty-nine participants were women, and seventeen were men.
The virtual activity was carried out with a software application we developed specifically for this study, which centered on the creation of a geometric model constructed from primitive solids. The students participated simultaneously with their group members in the virtual space and interacted together to build the model. Since this was a new software application and the students did not know how to use it, before each group of students began the modeling process, they first built a model in a shared virtual space with an instructor. The instructor helped the students to understand the functionality of the software and the modeling method.
At the end of the modeling activity, the students again took the two spatial tests under the same conditions as they had conducted previously. By comparing their performance on the pre- and post-tests, we draw some conclusions about the activity’s ability to improve spatial skills. In order to assess the design of the novel software we created for this study, the students also completed two questionnaires about the usability and functionality of the software after they finished the modeling activity.

3.1. Virtual Reality Experience

3.1.1. Hardware

The application used for the virtual experience ran on an HP laptop running 64-bit Windows with an Intel® Core™ i7-8750H processor running at 2 GHz, 8 GB of RAM, and an Nvidia GeForce GTX 1080 Ti graphics card. The devices used for the VR experience were HP Reverb head-mounted displays with a screen resolution of 2160 × 2160 per eye and two handheld controllers for user interaction.

3.1.2. Software

The virtual experience we created for this study is a collaborative VR application based on Unity. The central activity within the experience is the construction of a model from geometric solids. The virtual environment is shown in Figure 1.
In addition to the constructed model, the virtual environment also displays a screen that contains a visualization of a projection drawing of the model, and a menu where the user can interact with the application. For the construction of the larger model, the user can create four different primitive solids in the virtual space. The primitives are positioned in front of the user by clicking on the menu icons shown in Figure 2. The primitives are called ‘cube,’ ‘cradle,’ ‘rounding,’ and ‘hole.’
In addition to the creation of primitive shapes, the menu also allows the user to carry out two other operations: delete a previously created primitive shape and select the direction in which the shapes can be scaled.
The remaining menu options allow for the configuration of the environment. The “open” button allows an image file to be uploaded and displayed on the screen. The buttons “up” and “down” adjust the camera view within the environment to be better suited for the user’s height. The “freeze menu” button allows the user to anchor the menu in one spot, since, by default, the menu moves with the user’s point of view.
The created primitives can be displaced, rotated, and scaled by the user through natural gestures performed with the controllers. Figure 3 shows the displacement operation, in which the user grabs hold of a primitive by positioning the cursor of a controller over it and pressing the trigger button, then displaces the shape with a simple movement of the controller. Releasing the trigger button releases the primitive.
In order to facilitate the process of aligning the primitives to construct the larger model, when the face of one primitive is positioned within a small, experimentally determined distance of the face of another primitive already in the model, it is repositioned automatically so that the two primitives are perfectly aligned. In a similar manner, to rotate a primitive, the user captures it and turns the controller in the desired direction. The primitive follows the rotation of the controller until the trigger button is released (see Figure 4).
As with displacement, to facilitate the precise alignment of primitives, at the end of a rotation, if the orientation of the rotated primitive about the X, Y, or Z axis differs from that of another primitive by less than a small, experimentally determined angle, the rotated primitive automatically reorients so that the two primitives are aligned.
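To make these snapping rules concrete, the following Python fragment is a minimal sketch of the threshold-based alignment described above; the threshold values, function names, and data structures are illustrative assumptions and do not correspond to the actual Unity implementation.

```python
# Illustrative sketch of the snap-to-align behavior (assumed thresholds).
SNAP_DISTANCE = 0.05   # assumed face-to-face snapping distance (scene units)
SNAP_ANGLE = 5.0       # assumed per-axis snapping angle (degrees)

def snap_position(face_pos, other_face_pos, primitive_pos):
    """Align a released primitive to a nearby face of an existing primitive."""
    offset = [b - a for a, b in zip(face_pos, other_face_pos)]
    distance = sum(d * d for d in offset) ** 0.5
    if distance <= SNAP_DISTANCE:
        # Translate the whole primitive so that the two faces coincide exactly.
        return [p + d for p, d in zip(primitive_pos, offset)]
    return primitive_pos

def snap_rotation(euler, other_euler):
    """Per-axis angular snapping applied when the trigger is released."""
    snapped = list(euler)
    for axis in range(3):  # X, Y, Z
        diff = (euler[axis] - other_euler[axis]) % 360.0
        diff = min(diff, 360.0 - diff)
        if diff <= SNAP_ANGLE:
            snapped[axis] = other_euler[axis]
    return snapped

# Example: a face 0.03 units away triggers the snap, shifting the primitive by the same offset.
print(snap_position([1.0, 0.0, 0.0], [1.03, 0.0, 0.0], [0.5, 0.0, 0.0]))  # approx. [0.53, 0.0, 0.0]
```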
The operation of scaling the primitive is initiated by capturing a primitive with both controllers. Once the primitive is captured by both controllers, the scaling up or scaling down of the primitive is controlled by the change in the distance between the two controllers (see Figure 5).
Just as in the other operations, to facilitate the construction of the model, once the scaling of one primitive is finished, the driving dimension is rounded to the nearest integer, so that the final sizes of each primitive can easily be aligned. The scaling factor will be applied to the X, Y, or Z axis depending on which button, “X-axis”, “Y-axis”, or “Z-axis”, was selected from the menu.
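The scaling behavior can be sketched in the same spirit; here again, the function names, axis encoding, and minimum size are assumptions made for illustration rather than the application's actual code.

```python
# Illustrative sketch of two-controller scaling with integer rounding (assumed details).

def scale_primitive(size, axis, grab_distance, current_distance):
    """Scale the selected axis by the change in distance between the two controllers."""
    factor = current_distance / grab_distance
    scaled = list(size)
    scaled[axis] = size[axis] * factor          # axis: 0 = X, 1 = Y, 2 = Z (menu selection)
    return scaled

def finish_scaling(size, axis):
    """On release, round the driving dimension to the nearest integer."""
    final = list(size)
    final[axis] = max(1, round(size[axis]))     # assume at least one unit is kept
    return final

# Example: the user selected the Y axis and doubled the controller separation.
size = finish_scaling(scale_primitive([2.0, 3.2, 2.0], 1, 0.30, 0.60), 1)
print(size)  # -> [2.0, 6, 2.0]
```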

3.1.3. Collaborative Environment

The software was designed to be used in a collaborative manner, meaning that multiple users can simultaneously be present in the same virtual environment and participate in the construction of the same model together. Thus, the application coordinates the actions performed by different users. A user can manipulate any primitive at any time, provided that the primitive is not being held by another user. While a primitive is held by another user, it is transparent to the other users’ control cursors and cannot be grabbed until the holding user releases it.
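The ownership rule just described can be summarized as a simple per-primitive lock held by at most one user at a time. The sketch below illustrates the idea; the class and method names are hypothetical, and a real networked implementation would additionally need to arbitrate competing grab requests on the server.

```python
# Minimal sketch of per-primitive ownership: only one user may hold a primitive at a time.

class SharedPrimitive:
    def __init__(self, name):
        self.name = name
        self.holder = None          # user id of the current holder, or None

    def try_grab(self, user_id):
        """Return True if the user acquires the primitive, False if someone else holds it."""
        if self.holder is None:
            self.holder = user_id
            return True
        return False                # "transparent" to the other users' cursors

    def release(self, user_id):
        if self.holder == user_id:
            self.holder = None

cube = SharedPrimitive("cube_01")
assert cube.try_grab("user_A") is True
assert cube.try_grab("user_B") is False   # user_B cannot grab while user_A holds it
cube.release("user_A")
assert cube.try_grab("user_B") is True
```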
The users who are in the same virtual environment are visible to the rest of the users in the form of simplified avatars (see Figure 6). The users’ names hover above their avatars’ heads. During the entire activity, the users can communicate verbally.
Finally, given that a connection problem may occur since the users are connected remotely, the menu shown in Figure 2 includes a “without connection” message, which is normally hidden and is displayed only in the event of a connection problem.

3.1.4. Modeling Activity

The collaborative activity during the VR experience consisted of building six different models constructed from primitive solids. The models are shown in Figure 7. Each member of the collaborating group occupied a different physical office while wearing the VR headset, and all were connected to the same virtual environment through a local network. Participants were allotted two-hour morning sessions separated by 20 min breaks. Although there was no time limit to complete the modeling exercises, the majority of the groups completed them in two sessions. Only three of the ten groups required a third evening session.
Given that none of the students had previous experience using VR, before beginning the activity, they performed a practice exercise modeling the shape shown on the left of Figure 7. This exercise was led by an instructor, who explained to the students the functionality of the software application and how to complete the modeling process. The time it took to model this example with the instructor varied among the groups depending on the questions asked by the students, but it generally lasted 40 to 60 min.

3.2. Tests of Spatial Visualization

To assess the students’ spatial ability before and after the virtual experience, we used the Mental Rotation Test (MRT) proposed by Vandenberg and Kuse [66] and the Differential Aptitude Test: Spatial Relations (DAT:SR) [67]. The MRT is a test that measures an individual’s ability to mentally rotate 3D objects. The test consists of a total of 20 questions divided into two sections of 10 questions each. For each question, one must select two correct answers out of a total of four options. The two correct answers represent different rotated views of the shape in question. Test-takers are allowed six minutes to complete the test, three for each section, with a pause in between. The MRT was scored out of a total of 40 points, one point for each correctly identified shape. An example question from the MRT is shown in Figure 8.
Rather than mental rotation, the DAT:SR is considered to assess the skills of spatial visualization and spatial relations [35]. The DAT:SR consists of 50 questions. The left side of each question shows a flattened pattern with some shaded faces, representing the unfolded geometry of one of the four 3D shapes shown on the right. The test taker must select the shape on the right that corresponds to the flattened pattern. Twenty minutes are allotted to complete the test, and the test is scored out of 50 points, with one point for each correct answer. An example question from the DAT:SR is shown in Figure 9.
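The scoring rules for both instruments, as described above, can be summarized in a short sketch; the answer keys and response format below are illustrative assumptions, and no penalty for incorrect selections is assumed.

```python
# Sketch of the scoring rules described in the text (illustrative assumptions).

def score_mrt(responses, key):
    """MRT: one point per correctly identified shape, two correct options per item (max 40)."""
    score = 0
    for chosen, correct in zip(responses, key):
        score += len(set(chosen) & set(correct))   # each correct selection earns one point
    return score

def score_dat_sr(responses, key):
    """DAT:SR: one point per correct single-choice answer (max 50)."""
    return sum(1 for chosen, correct in zip(responses, key) if chosen == correct)

# Example with two MRT items: one item fully correct, one item half correct.
print(score_mrt([("A", "C"), ("B", "D")], [("A", "C"), ("B", "C")]))   # -> 3
```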
To facilitate the administration and scoring of the tests, the tests were administered digitally using Microsoft Forms.

3.3. Tests of Usability and Functionality of the VR Application

We used two questionnaires to assess the usability and functionality of the VR application.

3.3.1. Usability Assessment

The usability assessment used in this study is based on the System Usability Scale (SUS) [68], a tool designed to assess the usability of any device, application, or product. The instrument involves answering a series of Likert-type questions on a 1–5 scale, where an answer of 1 indicates that the user strongly disagrees with the statement and 5 indicates that they strongly agree. The questions used in our usability assessment are shown in Table 1.

3.3.2. Functionality Assessment

Similarly to the usability assessment, the functionality assessment involved answering a series of Likert-type questions based on a 1–5 scale, where 1 indicates that the user strongly disagrees with the statement, and 5 indicates that they strongly agree. The questions used in our assessment are shown in Table 2.

4. Results

This study began with the completion of the MRT and DAT as pre-tests of spatial skills and concluded with a post-test using the same instruments. Table 3 shows the results obtained from the 29 students who participated. The distribution of the MRT pre-test and post-test data was normal and without outliers. The distribution of the DAT pre-test and post-test data was bimodal and without outliers. Following the recommendation of Poncet et al. (2016) [69], we proceeded with the paired t-test analysis for both tests.
The mean score on the MRT improved by 34.0%, and the mean score on the DAT:SR improved by 1.6%. Thus, the improvement on the MRT score was higher than the improvement on the DAT:SR score, but performance improved on both tests.
A paired samples t-test was used to compare performance on the MRT pre-test and post-test. There was a significant improvement in the MRT post-test scores (M = 25.79, SD = 7.97) over the MRT pre-test scores (M = 19.24, SD = 9.28); t(28) = 5.94, p < 0.001. The effect size was large, with a Cohen’s d of 1.10, 95% CI [0.63, 1.56].
A paired samples t-test was also used to compare performance on the DAT:SR pre-test and post-test. There was a significant improvement in the DAT:SR post-test scores (M = 33.07, SD = 11.32) over the scores on the DAT:SR pre-test (M = 32.52, SD = 10.86); t(28) = 4.33, p < 0.001. The effect size was large, with a Cohen’s d of 0.80, 95% CI [0.37, 1.19].
A Pearson correlation coefficient was computed to assess the linear relationship between the MRT pre-test score and the DAT:SR pre-test score. There was not a significant correlation between the two scores, r(28) = 0.34, p = 0.700. Figure 10 shows a scatterplot of the pre-test scores on the MRT and DAT:SR, visually demonstrating this lack of correlation.
A Pearson correlation coefficient was also computed to assess the linear relationship between the MRT post-test score and the DAT:SR post-test score. There was a significant positive correlation between the two post-test scores, r(28) = 0.84, p < 0.001. Figure 11 shows a scatterplot of the post-test scores on the MRT and DAT:SR, visually demonstrating this positive correlation.
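For readers who wish to reproduce this type of analysis, the sketch below shows how the paired comparisons, effect sizes, and correlations could be computed with SciPy, assuming the raw score vectors were available; the arrays are placeholders, not the study data. Cohen's d is computed here as the mean difference divided by the standard deviation of the differences, which equals t divided by the square root of n and is consistent with the values reported above.

```python
# Reproducibility sketch with placeholder data (NOT the study scores).
import numpy as np
from scipy import stats

pre = np.array([12, 18, 25, 9, 30, 21, 17])     # placeholder pre-test scores
post = np.array([20, 24, 31, 15, 35, 26, 22])   # placeholder post-test scores

# Paired t-test on pre- vs. post-test scores.
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired samples: mean difference / SD of the differences.
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)

# Pearson correlation between two score vectors (e.g., the two post-test scores).
r, p_corr = stats.pearsonr(post, pre)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}, r = {r:.2f}")
```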
We will now report the results obtained on the questionnaires used to validate the usability and functionality of the VR application developed for use in this study. Following Lewis (1993) [70], we chose to use the mean to characterize the Likert scale data. The average score for each question on the usability assessment is reported in Table 4.
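As a minimal sketch, the per-question means and standard deviations reported in Tables 4 and 5 can be computed directly from the raw Likert responses; the response matrix below is a made-up example, not the study data.

```python
# Per-question summary of Likert responses (made-up example data).
from statistics import mean, pstdev

responses = {
    "Q1": [5, 5, 4, 5, 5],   # hypothetical 1-5 ratings from five participants
    "Q2": [1, 1, 1, 1, 1],
}

for question, ratings in responses.items():
    print(f"{question}: mean = {mean(ratings):.2f}, SD = {pstdev(ratings):.2f}")
```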
As shown in Table 4, all the positively worded questions (odd-numbered questions) received average scores above 4, and all the negatively worded questions (even-numbered questions) received average scores below 1.5, indicating that the students found the application to have good usability. Table 5 shows the results of the functionality assessment.
All users gave the highest possible rating to the questions of ease of navigating in the scene, creating the 3D primitives, drawing in 3D, and deleting the created primitives. There were also two questions rated with the lowest possible score by all users, which referred to the duplication of existing objects (Q9) and undoing or redoing previous actions (Q10). These functionalities were not implemented in the current version of our application, hence the low scores.

5. Discussion

Student performance on both the MRT and DAT:SR improved significantly from the pre-test to the post-test. This suggests that the 3D modeling activity the students worked on in our application may have had a positive impact on the spatial skills which are measured by these instruments. What is not as clear is why the correlation between scores on the MRT and DAT:SR was only significant in the post-test case. This could be due to the greater overall improvement in MRT scores compared to DAT:SR scores. Others have similarly reported that the DAT:SR yields relatively high pre-test scores, and that post-training gains are not as high on this test as on other spatial tests, possibly due to a ceiling effect [35].
Other possible explanations include a practice effect, whereby the students better understood the MRT the second time they took it, or the possibility that the students put more effort into the assessments after having completed the in-person portion of this study. While the MRT is an extremely popular and widely used instrument, it has been subject to criticism regarding its accuracy and fairness [71]. Test–retest reliability has not generally been demonstrated for the MRT, and even one additional test session tends to lead to improvements in scores [72]. However, this study is not unique in using the same spatial instruments as pre- and post-tests; this is a common practice across most spatial training studies. More research is needed to validate our 3D modeling activity with more students, but the initial results are promising with regard to its ability to train spatial skills.
The second portion of our study focused on the usability and functionality of our application. The usability was rated consistently high, suggesting that the participants found the application intuitive to use. The functionality was also rated highly, except for the two questions concerning functions that had not yet been implemented in our application. These findings suggest that the interactions proposed in this activity, including the manipulation of 3D primitives to build the model and the collaboration between users, may be useful for future applications involving 3D modeling in a virtual collaborative space.
The virtual collaborative aspects of the application appeared to lead to a positive experience for participants. Gathering together in the virtual space seemed to be a fun novelty for the participants, introducing something more exciting than collaborating in real physical space or working alone in VR. Student enjoyment of our application is quite important, since student interest in VR is viewed as one of the primary drivers of its adoption in education [73]. We also observed a shift in each group’s modeling approach: on the earlier models, members tended to work one at a time, while later they divided the task and worked simultaneously on different pieces of the model. This suggests that the opportunity for collaboration influenced participants’ approach to the modeling task. It is possible that, in the beginning, the participants needed feedback or reassurance from one another regarding how to use the application and how best to construct the model, but as they built confidence, they moved toward working more independently.

Limitations and Future Work

One limitation of this study is the small sample size. Conducting this study was quite time-intensive, as the students had to spend many hours participating in the modeling activity, which hindered our ability to obtain a larger number of participants. Another limitation is that the students all had an academic background that is already associated with higher performance on spatial assessments. In the future, this application could be tested with students from other academic backgrounds.
To address the possibility of practice effects, this application could be tested in a context with a control group that either does not participate in any spatial training or who participates in a different type of spatial training activity. Such a study design for a follow-up study could help remove the possibility of improvements in scores being related, in part or in full, to practice effects. Future work could also use a different approach to validate the training, such as by incorporating the pre- and post-spatial testing into the spatial training instrument itself. This may be advantageous, as VR may also be a more accurate medium to use to assess spatial skills in comparison to traditional tests [74].
It appears that the virtual collaborative environment made the modeling task more engaging for the students than it would have been to perform the same task alone in VR. Future studies could use our tool in both conditions, alone or collaboratively, to assess the specific impact of collaboration on the effectiveness of the tool.
Regarding the usability of the application, future work could incorporate an embedded tutorial within the application rather than having a live instruction session at the start of the study. This could reduce the amount of work on the instructor’s part and facilitate the use of the tool with more groups of students in the future.

6. Conclusions

As has been shown in numerous scientific publications, the use of VR in education is becoming more widespread and supported by educators. Within this category, immersive VR appears to be a potential tool that can foster students’ development in certain aptitudes and abilities. In this work, we have focused on the development of spatial relation skills and spatial perception skills, which are considered by many to be important for students who want to pursue STEM studies. To find out the degree to which immersive VR can help students develop these skills, we developed an application with a simple navigation interface and a reduced number of commands.
We investigated two aspects of the application. First, we studied the application’s potential to improve spatial abilities. The post-test scores on the MRT and DAT:SR showed significant improvement over the pre-test scores, suggesting that the application may have the potential to improve the spatial skills assessed by these two tests. Second, we studied the application’s usability and functionality by having the users answer questionnaires. From the results of the evaluations of the usability and functionality of the application, we can conclude that the tool is simple enough and comfortable enough to use so as not to detract from the building of skills. These findings can be helpful for future developers of collaborative VR applications for 3D modeling who may be able to use similar interfaces and interactions to what we presented here.
Our application leverages a collaborative environment that permits users to interact simultaneously with the project and maintain audiovisual communication. While many studies have used VR for spatial skills training, training in a collaborative virtual environment is a more novel approach. Our findings suggest that collaborative virtual environments may be advantageous for the training of spatial skills.

Author Contributions

Conceptualization, M.C. and J.C.; methodology, M.C.; software, J.C.; validation, J.C., F.J.M. and F.N.; formal analysis, J.C. and K.A.B.; investigation, J.C., F.J.M. and F.N.; resources, J.C.; data curation, K.A.B.; writing—original draft preparation, J.C., M.C. and K.A.B.; writing—review and editing, M.C. and K.A.B.; visualization, J.C. and F.J.M.; supervision, M.C. and J.C.; project administration, J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data available on request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pellas, N.; Kazanidis, I.; Palaigeorgiou, G. A Systematic Literature Review of Mixed Reality Environments in K-12 Education. Educ. Inf. Technol. 2020, 25, 2481–2520. [Google Scholar] [CrossRef]
  2. Heim, M. Virtual Realism; Oxford University Press: Oxford, UK, 1998. [Google Scholar]
  3. Leonard, S.N.; Fitzgerald, R.N. Holographic Learning: A Mixed Reality Trial of Microsoft HoloLens in an Australian Secondary School. Res. Learn. Technol. 2018, 26, 2160. [Google Scholar] [CrossRef] [Green Version]
  4. Chang, C.-W.; Lee, J.-H.; Wang, C.-Y.; Chen, G.-D. Improving the Authentic Learning Experience by Integrating Robots into the Mixed-Reality Environment. Comput. Educ. 2010, 55, 1572–1578. [Google Scholar] [CrossRef]
  5. Mateu, J.; Alamán Roldán, X. CUBICA: An Example of Mixed Reality. J. Univers. Comput. Sci. 2013, 19, 2598–2616. [Google Scholar] [CrossRef]
  6. Chao, J.; Chiu, J.L.; DeJaegher, C.J.; Pan, E.A. Sensor-Augmented Virtual Labs: Using Physical Interactions with Science Simulations to Promote Understanding of Gas Behavior. J. Sci. Educ. Technol. 2016, 25, 16–33. [Google Scholar] [CrossRef]
  7. Enyedy, N.; Danish, J.A.; DeLiema, D. Constructing Liminal Blends in a Collaborative Augmented-Reality Learning Environment. Int. J. Comput.-Support. Collab. Learn. 2015, 10, 7–34. [Google Scholar] [CrossRef]
  8. Han, J.; Jo, M.; Hyun, E.; So, H.-J. Examining Young Children’s Perception toward Augmented Reality-Infused Dramatic Play. Educ. Technol. Res. Dev. 2015, 63, 455–474. [Google Scholar] [CrossRef]
  9. Huber, T.; Paschold, M.; Hansen, C.; Wunderling, T.; Lang, H.; Kneist, W. New Dimensions in Surgical Training: Immersive Virtual Reality Laparoscopic Simulation Exhilarates Surgical Staff. Surg. Endosc. 2017, 31, 4472–4477. [Google Scholar] [CrossRef]
  10. Keefe, F.; Huling, D.A.; Coggins, M.J.; Keefe, D.F.; Zachary Rosenthal, M.; Herr, N.R.; Hoffman, H.G. Virtual reality for persistent pain: A new direction for behavioral pain management. Pain 2012, 153, 2163–2166. [Google Scholar] [CrossRef] [Green Version]
  11. Matsas, E.; Vosniakos, G.-C. Design of a Virtual Reality Training System for Human–Robot Collaboration in Manufacturing Tasks. Int. J. Interact. Des. Manuf. (IJIDeM) 2017, 11, 139–153. [Google Scholar] [CrossRef]
  12. Mól, A.C.A.; Jorge, C.A.F.; Couto, P.M. Using a Game Engine for VR Simulations in Evacuation Planning. IEEE Comput. Graph. Appl. 2008, 28, 6–12. [Google Scholar] [CrossRef] [PubMed]
  13. Zhang, S.; Moore, A. The Usability of Online Geographic Virtual Reality for Urban Planning. In Innovations in 3D Geo-Information Sciences; Lecture Notes in Geoinformation and Cartography; Isikdag, U., Ed.; Springer: Cham, Switzerland, 2014. [Google Scholar] [CrossRef]
  14. De la Peña, N.; Weil, P.; Llobera, J.; Spanlang, B.; Friedman, D.; Sanchez-Vives, M.V.; Slater, M. Immersive Journalism: Immersive Virtual Reality for the First-Person Experience of News. Presence 2010, 19, 291–301. [Google Scholar] [CrossRef] [Green Version]
  15. Staurset, E.; Prasolova-Førland, E. Creating a Smart Virtual Reality Simulator for Sports Training and Education. Smart Innov. Syst. Technol. 2016, 59, 423–433. [Google Scholar] [CrossRef]
  16. Lai, C.; McMahan, R.P.; Kitagawa, M.; Connolly, I. Geometry Explorer: Facilitating Geometry Education with Virtual Reality; Springer International Publishing: Cham, Switzerland, 2016; pp. 702–713. [Google Scholar]
  17. Ştefan, L. Immersive Collaborative Environments for Teaching and Learning Traditional Design. Procedia-Soc. Behav. Sci. 2012, 51, 1056–1060. [Google Scholar] [CrossRef] [Green Version]
  18. Lee, E.A.-L.; Wong, K.W. Learning with Desktop Virtual Reality: Low Spatial Ability Learners Are More Positively Affected. Comput. Educ. 2014, 79, 49–58. [Google Scholar] [CrossRef] [Green Version]
  19. Van Der Linden, A.; Van Joolingen, W. A Serious Game for Interactive Teaching of Newton’s Laws. In Proceedings of the 3rd Asia-Europe Symposium on Simulation & Serious Gaming—VRCAI ’16, New York, NY, USA, 3–4 December 2016; pp. 165–167. [Google Scholar] [CrossRef]
  20. Abulrub, A.-H.G.; Attridge, A.N.; Williams, M.A. Virtual Reality in Engineering Education: The Future of Creative Learning. In Proceedings of the 2011 IEEE Global Engineering Education Conference (EDUCON), Amman, Jordan, 4–6 April 2011; pp. 751–757. [Google Scholar] [CrossRef] [Green Version]
  21. Nadan, T.; Alexandrov, V.; Jamieson, R.; Watson, K. Is Virtual Reality a Memorable Experience in an Educational Context? Int. J. Emerg. Technol. Learn. (iJET) 2011, 6, 53–57. [Google Scholar] [CrossRef] [Green Version]
  22. Martin-Dorta, N.; Sanchez-Berriel, I.; Bravo, M.; Hernandez, J.; Saorin, J.L.; Contero, M. Virtual Blocks: A Serious Game for Spatial Ability Improvement on Mobile Devices. Multimed. Tools Appl. 2014, 73, 1575–1595. [Google Scholar] [CrossRef]
  23. Slavova, Y.; Mu, M. A Comparative Study of the Learning Outcomes and Experience of VR in Education. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18 March 2018; pp. 685–686. [Google Scholar]
  24. Zakaria, M.; Abuhassna, H.; Ravindaran, K. Virtual Reality Acceptance in Classrooms: A Case Study in Teaching Science. Int. J. Adv. Trends Comput. Sci. Eng. 2020, 9, 1280–1294. [Google Scholar] [CrossRef]
  25. Kyndt, E.; Raes, E.; Lismont, B.; Timmers, F.; Cascallar, E.; Dochy, F. A Meta-Analysis of the Effects of Face-to-Face Cooperative Learning. Do Recent Studies Falsify or Verify Earlier Findings? Educ. Res. Rev. 2013, 10, 133–149. [Google Scholar] [CrossRef]
  26. Gokhale, A.A. Collaborative Learning Enhances Critical Thinking. J. Technol. Educ. 1995, 7, 22–30. [Google Scholar] [CrossRef]
  27. Havenith, H.-B.; Cerfontaine, P.; Mreyen, A.-S. How Virtual Reality Can Help Visualise and Assess Geohazards. Int. J. Digit. Earth 2019, 12, 173–189. [Google Scholar] [CrossRef]
  28. Saorín, J.L.; de la Torre-Cantero, J.; Melián Díaz, D.; López-Chao, V. Cloud-Based Collaborative 3D Modeling to Train Engineers for the Industry 4.0. Appl. Sci. 2019, 9, 4559. [Google Scholar] [CrossRef] [Green Version]
  29. Dominguez, M.G.; Martin-Gutierrez, J.; Roca, C. Tools, Methodologies and Motivation to Improve Spatial Skill on Engineering Students. In Proceedings of the ASEE Annual Conference & Exposition, Atlanta, Georgia, 23–26 June 2013. [Google Scholar]
  30. Sorby, S.A.; Veurink, N.; Streiner, S. Does Spatial Skills Instruction Improve STEM Outcomes? The Answer Is ‘Yes.’. Learn. Individ. Differ. 2018, 67, 209–222. [Google Scholar] [CrossRef]
  31. Hegarty, M.; Waller, D.A. Individual Differences in Spatial Abilities. In The Cambridge Handbook of Visuospatial Thinking; Shah, P., Miyake, A., Eds.; Cambridge University Press: Cambridge, UK, 2005; pp. 121–169. [Google Scholar]
  32. Uttal, D.H.; Meadow, N.G.; Tipton, E.; Hand, L.L.; Alden, A.R.; Warren, C.; Newcombe, N.S. The Malleability of Spatial Skills: A Meta-Analysis of Training Studies. Psychol. Bull. 2013, 139, 352–402. [Google Scholar] [CrossRef] [Green Version]
  33. Stieff, M.; Uttal, D. How Much Can Spatial Training Improve STEM Achievement? Educ. Psychol. Rev. 2015, 27, 607–615. [Google Scholar] [CrossRef]
  34. Atit, K.; Uttal, D.H.; Stieff, M. Situating Space: Using a Discipline-Focused Lens to Examine Spatial Thinking Skills. Cogn. Res. 2020, 5, 19. [Google Scholar] [CrossRef] [Green Version]
  35. Gorska, R.; Sorby, S.A. Testing Instruments for the Assessment of 3D Spatial Skills. In Proceedings of the 2008 ASEE Annual Conference & Exposition, Pittsburgh, PA, USA, 22 June 2008; pp. 13.1196.1–13.1196.10. [Google Scholar]
  36. Carroll, J.B. Human Cognitive Abilities: A Survey of Factor-Analytic Studies; Cambridge University Press: Cambridge, UK, 1993. [Google Scholar]
  37. Bartlett, K.A.; Camba, J.D. The Role of a Graphical Interpretation Factor in the Assessment of Spatial Visualization: A Critical Analysis. Spat. Cogn. Comput. 2023, 23, 1–30. [Google Scholar] [CrossRef]
  38. Bartlett, K.A.; Camba, J.D. Isometric Projection as a Threat to Validity in the PSVT:R. In Proceedings of the 2022 ASEE Annual Conference & Exposition, Minneapolis, MN, 26 June 2022. [Google Scholar]
  39. Lin, P.-H.; Yeh, S.-C. How Motion-Control Influences a VR-Supported Technology for Mental Rotation Learning: From the Perspectives of Playfulness, Gender Difference and Technology Acceptance Model. Int. J. Hum. –Comput. Interact. 2019, 35, 1736–1746. [Google Scholar] [CrossRef]
  40. Roca-González, C.; Martin Gutierrez, J.; GarcÍa-Dominguez, M.; Mato Carrodeguas, M.D.C. Virtual Technologies to Develop Visual-Spatial Ability in Engineering Students. Eurasia J. Math. Sci. Technol. Educ. 2016, 13. [Google Scholar] [CrossRef]
  41. Ariali, S. Training of Mental Rotation Ability in Virtual Spaces. J. Tech. Educ. 2020, 8, 46–63. [Google Scholar]
  42. Ariali, S.; Zinn, B. Adaptive Training of the Mental Rotation Ability in an Immersive Virtual Environment. Int. J. Emerg. Technol. Learn. (iJET) 2021, 16, 20–39. [Google Scholar] [CrossRef]
  43. Chang, C.-W.; Heo, J.; Yeh, S.-C.; Han, H.-Y.; Li, M. The Effects of Immersion and Interactivity on College Students’ Acceptance of a Novel VR-Supported Educational Technology for Mental Rotation. IEEE Access 2018, 6, 66590–66599. [Google Scholar] [CrossRef]
  44. Chen, Q.; Deng, L.; Xu, T.; Zhou, Y. Visualized Cues for Enhancing Spatial Ability Training in Virtual Reality. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, 12–16 March 2022; pp. 299–300. [Google Scholar] [CrossRef]
  45. González, N.A.A. Development of Spatial Skills with Virtual Reality and Augmented Reality. Int. J. Interact. Des. Manuf. 2018, 12, 133–144. [Google Scholar] [CrossRef]
  46. Chang, J.S.-K.; Yeboah, G.; Doucette, A.; Clifton, P.; Nitsche, M.; Welsh, T.; Mazalek, A. Evaluating the Effect of Tangible Virtual Reality on Spatial Perspective Taking Ability. In Proceedings of the 5th Symposium on Spatial User Interaction, Brighton, UK, 16–17 October 2017; pp. 68–77. [Google Scholar]
  47. Hong, J.-C.; Hwang, M.-Y.; Tai, K.-H.; Tsai, C.-R. Training Spatial Ability Through Virtual Reality. In Proceedings of the 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), Wollongong, NSW, Australia, 4–7 December 2018; pp. 1204–1205. [Google Scholar]
  48. Kim, J.; Irizarry, J. Evaluating the Use of Augmented Reality Technology to Improve Construction Management Student’s Spat. Int. J. Constr. Educ. Res. 2021, 17, 99–116. [Google Scholar]
  49. Martín-Gutiérrez, J.; Luís Saorín, J.; Contero, M.; Alcañiz, M.; Pérez-López, D.C.; Ortega, M. Design and Validation of an Augmented Book for Spatial Abilities Development in Engineering Students. Comput. Graph. 2010, 34, 77–91. [Google Scholar] [CrossRef]
  50. Molina-Carmona, R.; Pertegal-Felices, M.; Jimeno-Morenilla, A.; Mora-Mora, H. Virtual Reality Learning Activities for Multimedia Students to Enhance Spatial Ability. Sustainability 2018, 10, 1074. [Google Scholar] [CrossRef] [Green Version]
  51. Bekele, M.K.; Champion, E. A Comparison of Immersive Realities and Interaction Methods: Cultural Learning in Virtual Heritage. Front. Robot. AI 2019, 6, 91. [Google Scholar] [CrossRef] [Green Version]
  52. Stock, C.; Bishop, I.D.; O’Connor, A.N.; Chen, T.; Pettit, C.J.; Aurambout, J.-P. SIEVE: Collaborative Decision-Making in an Immersive Online Environment. Cartogr. Geogr. Inf. Sci. 2008, 35, 133–144. [Google Scholar] [CrossRef]
  53. Loeffler, C.; Anderson, T. The Virtual Reality Casebook; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 1994; ISBN 0-442-01776-6. [Google Scholar]
  54. Mayrath, M.C.; Traphagan, T.; Heikes, E.J.; Trivedi, A. Instructional Design Best Practices for Second Life: A Case Study from a College-Level English Course. Interact. Learn. Environ. 2011, 19, 125–142. [Google Scholar] [CrossRef]
  55. Johannesen, M. The Role of Virtual Learning Environments in a Primary School Context: An Analysis of Inscription of Assessment Practices. Br. J. Educ. Technol. 2013, 44, 302–313. [Google Scholar] [CrossRef]
  56. Oh, K.; Nussli, N. Teacher Training in the Use of a Three-Dimensional Immersive Virtual World: Building Understanding through First-Hand Experiences. J. Teach. Learn. Technol. 2014, 33–58. [Google Scholar] [CrossRef] [Green Version]
  57. Witmer, B.G.; Bailey, J.H.; Knerr, B.W.; Abel, K. Training Dismounted Soldiers in Virtual Environments: Route Learning and Transfer; U.S Army Research Institute for the Behavioral and Social Sciences: Alexandria, VA, USA, 1995. [Google Scholar]
  58. Bliss, J.P.; Tidwell, P.D.; Guest, M.A. The Effectiveness of Virtual Reality for Administering Spatial Navigation Training to Firefighters. Presence Teleoperators Virtual Environ. 1997, 6, 73–86. [Google Scholar] [CrossRef]
  59. Koh, G.; von Wiegand, T.E.; Garnett, R.L.; Durlach, N.I.; Shinn-Cunningham, B. Use of Virtual Environments for Acquiring Configurational Knowledge about Specific Real-World Spaces: I. Preliminary Experiment. Presence 1999, 8, 632–656. [Google Scholar] [CrossRef]
  60. Pollard, K.A.; Oiknine, A.H.; Files, B.T.; Sinatra, A.M.; Patton, D.; Ericson, M.; Thomas, J.; Khooshabeh, P. Level of Immersion Affects Spatial Learning in Virtual Environments: Results of a Three-Condition within-Subjects Study with Long Intersession Intervals. Virtual Real. 2020, 24, 783–796. [Google Scholar] [CrossRef]
  61. Carlsson, C.; Hagsand, O. DIVE—A Platform for Multi-User Virtual Environments. Comput. Graph. 1993, 17, 663–669. [Google Scholar] [CrossRef]
  62. Valadares, A.; Debeauvais, T.; Lopes, C.V. Evolution of scalability with synchronized state in virtual environments. In Proceedings of the 2012 IEEE International Workshop on Haptic Audio Visual Environments and Games (HAVE 2012) Proceedings, Munich, Germany, 8–9 October 2012. [Google Scholar] [CrossRef]
  63. Rai, A.; Kannan, R.J.; Ramanathan, S. Multi-user networked framework for virtual reality platform. In Proceedings of the 14th IEEE Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 8–11 January 2017. [Google Scholar] [CrossRef]
  64. Yeung, C.; Zhang, Y.C. Minority Games. In Encyclopedia of Complexity and Systems Science; Meyers, R., Ed.; Springer: New York, NY, USA, 2009. [Google Scholar] [CrossRef]
  65. Valadares, A.; Gabrielova, E.; Lopes, C.V. On Designing and Testing Distributed Virtual Environments. Concurr. Comput. Pract. Exp. 2016, 28, 3291–3312. [Google Scholar] [CrossRef] [Green Version]
  66. Vandenberg, S.G.; Kuse, A.R. Mental Rotations, a Group Test of Three-Dimensional Spatial Visualization. Percept. Mot. Ski. 1978, 47, 599–604. [Google Scholar] [CrossRef]
  67. Bennett, G.K.; Seashore, H.G.; Wesman, A.G. Differential Aptitude Test, 5th ed.; Psychological Corporation: San Antonio, TX, USA, 1974. [Google Scholar]
  68. Brooke, J. SUS: A “Quick and Dirty” Usability Scale. In Usability Evaluation in Industry; CRC Press: Boca Raton, FL, USA, 1996; pp. 207–212. [Google Scholar]
  69. Poncet, A.; Courvoisier, D.S.; Combescure, C.; Perneger, T.V. Normality and Sample Size Do Not Matter for the Selection of an Appropriate Statistical Test for Two-Group Comparisons. Methodology 2016, 12, 61–71. [Google Scholar] [CrossRef]
  70. Lewis, J.R. Multipoint scales: Mean and median differences and observed significance levels. Int. J. Hum.-Comput. Interact. 1993, 5, 383–392. [Google Scholar] [CrossRef]
  71. Bartlett, K.A.; Camba, J.D. Gender Differences in Spatial Ability: A Critical Review. Educ. Psychol. Rev. 2023, 35, 8. [Google Scholar] [CrossRef]
  72. Peters, M.; Lehmann, W.; Takahira, S.; Takeuchi, Y.; Jordan, K. Mental Rotation Test Performance in Four Cross-Cultural Samples (N = 3367): Overall Sex Differences and the Role of Academic Program in Performance. Cortex 2006, 42, 1005–1014. [Google Scholar] [CrossRef] [PubMed]
  73. Rojas-Sánchez, M.A.; Palos-Sánchez, P.R.; Folgado-Fernández, J.A. Systematic Literature Review and Bibliometric Analysis on Virtual Reality and Education. Educ. Inf. Technol. 2023, 28, 155–192. [Google Scholar] [CrossRef] [PubMed]
  74. Bartlett, K.A.; Camba, J.D. An Argument for Visualization Technologies in Spatial Skills Assessment. In Proceedings of the Learning and Collaboration Technologies. Designing the Learner and Teacher Experience, Online, 26 July 2022; Volume 13328, pp. 30–39. [Google Scholar]
Figure 1. Application environment.
Figure 2. Application menu.
Figure 3. Displacement of primitives.
Figure 4. Rotation of primitives.
Figure 5. Scaling of primitives.
Figure 6. The collaborative model with three users participating.
Figure 7. Models built during the VR activity.
Figure 8. Example question from the MRT (Vandenberg and Kuse, 1978) [66]. The circle on the far left contains the shape in question. The test-taker must select the two shapes from answer choices (A–D) which are the same shape as the shape in question, but rotated into a different position.
Figure 9. Sample problem from the DAT:SR. One of the answer choices labelled (A–D) is a possible viewpoint of the box folded from the flat pattern on the left of the vertical line.
Figure 10. Scatterplot of pre-test scores on the MRT and DAT:SR.
Figure 11. Scatterplot of post-test scores on the MRT and DAT.
Table 1. Questions used in the usability assessment (SUS).
Q1. I would use this tool frequently
Q2. I find this tool unnecessarily complex
Q3. The tool was easy to use
Q4. I would need the help of somebody with technical knowledge of this tool
Q5. The tool’s functionality is well integrated
Q6. The tool is inconsistent
Q7. Most people could learn how to use this tool very quickly
Q8. The tool is very difficult to use
Q9. I feel confident using this tool
Q10. I had to learn many things before I could use this tool
Table 2. Questions used in the functionality assessment.
Q1. Navigating the command menus was not a problem for me
Q2. Navigating the scene was not a problem for me
Q3. Changing the scale of the scene was not a problem for me
Q4. Finding the command I was looking for was not a problem for me
Q5. Creating a 3D primitive was not a problem for me
Q6. Drawing in 3D was not a problem for me
Q7. Deleting an object was not a problem for me
Q8. Editing an object was not a problem for me
Q9. Copying an object was not a problem for me
Q10. Undoing and redoing the last command was not a problem for me
Table 3. Mean scores on the MRT and DAT pre- and post-tests.
N Mean SD
MRT-Pre 29 19.24 9.28
MRT-Post 29 25.79 7.97
DAT-Pre 29 32.52 10.86
DAT-Post 29 33.07 11.32
Table 4. Results of the usability assessment; average score with the standard deviation in parentheses (1 = strongly disagree, 5 = strongly agree).
Q1. I would use this tool frequently: 4.79 (0.41)
Q2. I find this tool unnecessarily complex: 1.00 (0)
Q3. The tool was easy to use: 4.69 (0.46)
Q4. I would need the help of somebody with technical knowledge of this tool: 1.31 (0.46)
Q5. The tool’s functionality is well integrated: 4.90 (0.30)
Q6. The tool is inconsistent: 1.14 (0.34)
Q7. Most people could learn how to use this tool very quickly: 4.79 (0.41)
Q8. The tool is very difficult to use: 1.03 (0.18)
Q9. I feel confident using this tool: 4.14 (0.78)
Q10. I had to learn many things before I could use this tool: 1.17 (0.38)
Table 5. Results of the functionality assessment; average score with the standard deviation in parentheses (1 = strongly disagree, 5 = strongly agree).
Q1. Navigating the command menus was not a problem for me: 4.69 (0.46)
Q2. Navigating in the scene was not a problem for me: 5.00 (0)
Q3. Changing the scale of the scene was not a problem for me: 3.90 (0.80)
Q4. Finding the commands I was looking for was not a problem for me: 4.97 (0.18)
Q5. Creating a 3D primitive was not a problem for me: 5.00 (0)
Q6. Drawing in 3D was not a problem for me: 5.00 (0)
Q7. Deleting an object was not a problem for me: 5.00 (0)
Q8. Modifying an object was not a problem for me: 4.31 (0.46)
Q9. Copying an object was not a problem for me: 1.00 (0)
Q10. Undoing and redoing the last command was not a problem for me: 1.00 (0)