Article

Learning by Doing in VR: A User-Centric Evaluation of Lathe Operation Training

1 Departamento de Estructuras, Construcción y Expresión Gráfica, Universidad Politécnica de Cartagena, 30202 Murcia, Spain
2 Instituto Universitario de Investigación en Tecnología Centrada en el Ser Humano, Universidad Politécnica de Valencia, 46022 Valencia, Spain
* Author to whom correspondence should be addressed.
Electronics 2024, 13(13), 2549; https://doi.org/10.3390/electronics13132549
Submission received: 11 May 2024 / Revised: 17 June 2024 / Accepted: 26 June 2024 / Published: 28 June 2024

Abstract: This study presents the development and evaluation of an immersive virtual reality (VR) application designed for lathe operation training. The VR application, built using Unity for Oculus Rift headsets, aims to simulate a realistic lathe machining experience, allowing users to interact with the machine’s various controls and levers. The experimental analysis involved 20 second-year Mechanical Engineering students who performed machining tasks in the virtual environment. The usability and user experience of the application were assessed using the System Usability Scale (SUS) and a 12-item questionnaire. The SUS results yielded a high mean score of 96.25 (SD = 6.41), indicating excellent usability. The user experience evaluation also showed positive feedback, with high ratings for the sense of presence, realism, and usefulness for training purposes. However, some users reported minor physical discomfort, such as dizziness. The study concludes that immersive VR is a valuable tool for enhancing training in lathe operations, offering an engaging and realistic experience that encourages active learning. Future work should focus on reducing physical discomfort and further improving the application’s realism and interactivity.

1. Introduction

Virtual reality has its roots in stereoscopic vision, which emerged as an innovative branch in the 19th century. The first stereoscope was invented by the British scientist Charles Wheatstone in 1838. This viewing device combined two offset images to simulate a three-dimensional image [1]. Twelve years later, the first stereoscopic camera appeared, allowing the capture of two images simultaneously, which, when viewed through a special viewer, gave a sense of depth [2].
In the 20th century, more sophisticated systems oriented toward virtual reality began to emerge. For instance, The Teleyeglasses, a portable television worn on the head like glasses, was developed in 1963 by Hugo Gernsback. These glasses allowed the user to watch television stereoscopically, although the device could not respond to head movements to provide new perspectives [3]. Another significant development was The Sword of Damocles, considered the true precursor to modern virtual reality glasses. This device consisted of a mechanical arm attached to the ceiling that followed the head’s rotation while looking around. It held a helmet with two CRT screens that displayed wireframe graphics to the user [4].
The 1980s and 1990s were a period of great boom for video games. However, attempts to develop devices that would allow users to immerse themselves in the scene, such as Sega’s VR headset in 1993 and Nintendo’s Virtual Boy console in 1995, failed [5]. It was not until 2012 that Palmer Luckey developed his first Oculus Rift development kit (DK1), which sparked the growing development of such devices, initially oriented towards the entertainment world [6].
Subsequent advances along with the decreasing cost of this technology mean that virtual reality has become an affordable technology in many areas [7,8], for example, supporting learning and teaching processes in both academic and industrial settings. Leonard and Fitzgerald [9] highlighted the increase in participation and satisfaction in learning with the use of virtual reality. Chang et al. [10] and Mateu and Alaman [11] emphasized students’ motivation and improvement in problem-solving skills with the use of virtual reality. Additionally, Chao et al. [12], Enyedy et al. [13], and Han et al. [14] underscored the advantages of virtual reality for the development of sensory-based learning.
Examples from different disciplines and levels include work in medicine by Huber et al. [15], in rehabilitation by Keefe [16], in industrial training by Matsas and Vosniakos [17], in the study of evacuation plans by Mol et al. [18], in urban planning by Zhang [19], in journalism by De la Peña et al. [20], in sports by Staurset and Prasolova-Forland [21], in geometry by Lai et al. [22], in design by Stefan [23], in biology by Ai-Lim [24], in physics by Van der Linden [25], and in engineering by Abulrub [26].
In professional training in the industry, virtual reality applications have primarily aimed to reduce failures and accidents. This is achieved by improving employees’ ability to react to technical problems [27], reinforcing learning from practical training to avoid accidents during maintenance work, and minimizing damage to industrial process equipment during operation [28]. Virtual tours of plants allow for observation of operations and identification of potential failures before going live [29].
According to García et al. [30], virtual reality in industry combines the design of industrial processes with computer simulation, providing new learning and training opportunities, faster adaptation to new systems, customization of training requirements, cost reduction, and preservation of worker integrity. Notable work in this field includes research by Chrysoulas et al. [31], who developed a virtual/augmented reality laboratory for teaching industrial automation concepts, validating maintenance procedures, and verifying assembly and manufacturing processes. Cvetkovski et al. [32] created an application where users can assemble an electric motor with 3D parts in a virtual environment, visualize a package sorting process, and identify the construction characteristics of an actuator and its selection process. García et al. [33] developed a virtual reality application to simulate the operation of an Oil & Gas system, allowing users to configure HART instruments in calibration and field installation. Morales-Menéndez and Ramírez-Mendoza [34] developed a hybrid proposal combining a virtual and remote laboratory, focusing on industrial networks, integrated manufacturing systems, and automation. Román-Ibañez et al. [35] developed a simulator of industrial robotic arms for designing, visualizing, monitoring, and performing security controls. Bin et al. [36] proposed the three-dimensional simulation of a robotic arm, giving users a virtual experience without needing a real one.
Special attention is paid to applications that allow interaction with both virtual and real-world objects. Wang and Wang [37] developed a virtual building for implementing automated systems, with automation programmed in the virtual environment being sent to a Siemens S7-300 PLC, which, in turn, communicates with the building via the TCP/IP protocol. Chen et al. [38] developed a logic control mechanism integrated with virtual reality that synchronously works with a real development, exchanging data with a PLC. Togias et al. [39] created a teleoperation system for designing and controlling robots in a production station, which, upon validation in virtual reality, can send the code to the robot controller for actual execution. Kalinov et al. [40] presented WareVR, a human–robot interface based on a virtual reality application for interaction with an automated robotic system.
According to Bekele and Champion [41], it is essential to differentiate between non-immersive virtual reality, where the user observes through a screen, and immersive virtual reality, also known as an immersive 3D environment (Stock et al. [42]), where the user is part of a virtual space and can interact with the 3D content (Loeffer and Anderson [43]).
Many of the previously mentioned works implement non-immersive virtual environments, with fewer being developed in immersive environments. Some notable immersive applications include the work by Allmacher et al. [44], who developed an application in Unity 3D using HTC Vive HMDs, allowing users to interact in the virtual environment and perform functional tests to validate vehicle guidance systems before constructing real prototypes, saving resources. García et al. [30] developed an FESTO pneumatic lab in immersive virtual reality with the help of the Oculus Rift VR headset using the MQTT protocol on a Raspberry Pi to work synchronously with the real module. Aldea et al. [45] created an immersive VR application for evaluating automated vehicle HMI designs, enabling researchers and designers to build virtual prototypes and evaluation scenarios quickly and resource-efficiently. Malik et al. [46] explored virtual reality technological development for designing industrial production systems based on integration with simulated human–robot environments, with the virtual model created in Siemens NX exported to an immersive virtual environment for HTC Vive headsets. Pruna et al. [27] developed a simulator in Unity 3D and Labview, using Oculus Rift headsets to emulate the real process of a plant, with monitoring and control through HMI and TCP/IP protocol. Vergara et al. [47] published a set of rules with the objective of serving as a guide for the creation of virtual reality applications, and Hernández-Chávez et al. [48] presented a virtual tool focused on students of the academic career of Automotive Systems Engineering.
In this paper, we introduce an innovative application that enables users to machine specimens on a lathe within an immersive virtual reality environment. By leveraging visual, auditory, and tactile interactions, this application provides a comprehensive understanding of all system components [49]. While similar commercial developments are on the market, as noted in references [50,51,52,53], scientific publications on these are limited. Our work stands out in its detailed simulation of lathe machining, comparable to the approach in [54], and offers a thorough usability study. Unlike the study in [55], which lacks depth in its exploration of the virtual reality tool, our application focuses extensively on the user experience and system features.

2. Materials and Methods

Our research focuses on an experimental analysis that quantitatively measures the acceptance of immersive virtual reality as a tool for improving user training in lathe operation. To achieve this, we worked with a group of 20 second-year students of the Bachelor’s Degree in Mechanical Engineering, 4 women and 16 men, aged between 19 and 21 years. All of them volunteered to participate in one of the practical sessions proposed in a course on Production System Engineering, using the virtual reality application instead of the real lathe in the laboratory. All participants had previously completed a practical session using a real lathe. After the virtual experience, the students answered 12 questions extracted from four well-recognized tests aimed at evaluating the user experience. The results obtained from these tests were used to draw the conclusions presented in this study. To assess whether the software used in the virtual reality experience was uncomfortable for the students, potentially influencing the test results, the students were also asked to complete an additional test to evaluate the tool’s usability.

2.1. Lathe Reference Model

The VR application used in this study has been developed based on the CMZ T-360 lathe, a widely used mechanical or parallel lathe in educational and training environments. This type of lathe operates by machining material through chip removal, where the tool advances while the workpiece, anchored to the machine, rotates at specific revolutions.
The following points correspond to the numbered components shown in Figure 1, which describe the elements considered in the virtualization process.
  • Bed: The foundation on which all the lathe’s components and carriages are mounted.
  • Headstock: Fixed to the bed, it supplies power to the spindle.
  • Spindle: Electrically powered by the headstock, it secures the workpiece for rotation using adjustable chucks or jaws.
  • Longitudinal Carriage: Moves the tool longitudinally along the workpiece.
  • Longitudinal Carriage Handwheel: Allows manual movement of the longitudinal carriage.
  • Cylinder and Facing Lever: Enables automatic movement of the longitudinal carriage based on the selection made on the bed controls.
  • Cross Carriage: Moves the tool transversely to the spindle and can be operated automatically with the cylinder and facing lever.
  • Cross Carriage Handwheel: Allows manual movement of the cross carriage.
  • Tool Post: Holds the tool firmly and moves longitudinally along the workpiece.
  • Tool Carriage Handwheel: Allows manual movement of the tool carriage.
  • Tailstock Carriage: Secures the workpiece to prevent misalignment.
  • Tailstock Lock Crank: Releases or locks the tailstock carriage on the bed.
  • Tailstock Quill Crank: Facilitates disassembly and replacement of the tailstock quill with the centering bit.
  • Tailstock Handwheel: Moves the tailstock axis.
  • Main Power Switch: Controls the overall power supply to the machine.
  • Spindle On/Off Switch: Controls the rotation of the spindle.
  • Emergency Stops: Includes both manual and foot-activated emergency stops.
  • Protective Screen: Prevents chips from escaping and stops the process if lifted during machining.
The operations that can be performed using this lathe, and which are the focus of this study, include:
  • Cylindrical Turning: Achieving cylindrical surfaces by moving the tool parallel to the axis of the workpiece.
  • Facing and Parting Off: Producing flat surfaces by moving the tool perpendicular to the axis of rotation of the workpiece.
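As a conceptual illustration (the application itself is implemented in Unity, not Python), these two operations can be sketched as transformations of a cylindrical specimen: cylindrical turning reduces the radius, while facing reduces the length. All names below are illustrative, not part of the application.

```python
from dataclasses import dataclass

@dataclass
class Specimen:
    radius: float  # mm
    length: float  # mm

def cylindrical_turning(s: Specimen, depth_of_cut: float) -> Specimen:
    """Tool moves parallel to the workpiece axis: the radius is reduced."""
    return Specimen(radius=max(s.radius - depth_of_cut, 0.0), length=s.length)

def facing(s: Specimen, depth_of_cut: float) -> Specimen:
    """Tool moves perpendicular to the axis of rotation: the length is reduced."""
    return Specimen(radius=s.radius, length=max(s.length - depth_of_cut, 0.0))

s = Specimen(radius=20.0, length=100.0)
s = cylindrical_turning(s, 2.5)   # radius 20.0 -> 17.5
s = facing(s, 1.0)                # length 100.0 -> 99.0
```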

2.2. Virtual Reality Application Implementation

2.2.1. Hardware and Software Components

The application used for the virtual experience was run on laptops equipped with the Windows® 10 64-bit operating system, an Intel® Core™ i7-8750H 2.20 GHz processor, 8 GB of RAM, and an NVIDIA® GeForce® GTX 1080 Ti graphics card. To achieve an immersive virtual experience, Oculus Rift™ headsets were utilized. These headsets feature OLED displays with a resolution of 2160 × 1200 pixels and a refresh rate of 90 Hz. For interaction, two TouchController™ devices were used, equipped with sensors that detect finger positions, allowing differentiation between grip and pointer actions. The setup also included two Oculus Constellation™ sensors to track the movements of the TouchController as seen in Figure 2.
In our study, Unity® 2022.3.10f1 (San Francisco, CA, USA) was chosen as the development platform for creating the virtual reality (VR) application due to its widespread use and extensive developer community. To achieve an immersive environment, we utilized Unity’s built-in support for Oculus™ (Irvine, CA, USA), which includes several critical components: the OVRCameraRig for VR camera control, a unified input API for managing controller inputs, and a first-person prefab for user interaction. Additionally, we incorporated CustomHandRight and CustomHandLeft to ensure precise hand presence within the VR environment.
We further enhanced our project by integrating the VRTK™ (Virtual Reality Toolkit, San Francisco, CA, USA) package to boost productivity and accelerate the development process. This toolkit was particularly valuable for defining joints, such as the Rotational Joint Drive and Directional Joint Drive, and for implementing a sophisticated gripping system. These features allowed us to manipulate virtual components effectively using the CustomHandRight and CustomHandLeft models.
To facilitate user navigation within the virtual scene, we implemented the teleportation system provided by Oculus™. This was achieved by integrating the ObjectPointer.Curved prefab and defining the PointerFacade script. This setup enabled users to move through the environment using the OculusTouch™ controllers. Additionally, we employed the AxisRotator script to enhance maneuverability, allowing users to select their orientation during teleportation.

2.2.2. Details of the Implementation

To immerse the user in an industrial environment, we situated the lathe within a virtual industrial building, navigable using the first-person mode supported by Unity for Oculus. The industrial building was modeled in Autodesk® Revit® 2023 (Autodesk, San Francisco, CA, USA), a widely adopted tool in the industry and freely available for academic use, which allows for direct export of the model in FBX format, compatible with Unity.
The lathe, presented in Figure 3a, was modeled at full scale based on precise measurements taken directly from the machine. This modeling was carried out in SolidWorks® 2023 (Waltham, MA, USA), a leading mechanical design package. To ensure the mobility of each component, the lathe was modeled as individual parts, which were then assembled, establishing positional relationships to simulate realistic movements.
To import SolidWorks files into Unity, we utilized Pixyz® (San Francisco, CA, USA), a tool specifically designed to import three-dimensional engineering and architectural documentation into game engines. Pixyz offers seamless integration with the Unity editor, enabling the import of textures, colors, and the hierarchical structure of assemblies. This allows for detailed access to the different components within the assemblies (see Figure 3b).
By combining these advanced modeling and integration techniques, we have created a highly realistic and interactive virtual environment for lathe operation training, leveraging industry-standard tools to ensure accuracy and usability. This approach enhances the training experience by providing users with a detailed and immersive industrial setting.

Specimens

Before operating the lathe, it is essential to obtain the specimen that will be machined. We included a vending machine in the virtual environment to streamline this process, as depicted in Figure 4. This vending machine allows users to select the dimensions of the desired specimen, providing a convenient and interactive method for preparing the materials needed for machining.
This addition enhances the realism and functionality of the training environment, enabling users to experience the full workflow of lathe operation from specimen selection to final machining. By simulating this preparatory step, the virtual reality application offers a comprehensive and immersive training experience, ensuring users are well acquainted with all aspects of the machining process.
By interacting with the buttons, users can define the height and radius of the specimen to be machined. Once the desired dimensions are set, pressing another button will produce the corresponding specimen.
The action of pushing these and subsequent buttons, which will be detailed in this document, was implemented using the DirectionalJointDrive prefab. This prefab allows an object to move in a specified direction, with the button’s position within defined maximum and minimum limits determining its on/off status.
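The on/off logic described above can be summarized as: the button slides along one axis between two travel limits, and it counts as pressed once it crosses a threshold within that travel. The following Python sketch illustrates only this logic; the class and parameter names are illustrative, not the actual DirectionalJointDrive API used in the Unity application.

```python
def clamp(value, lo, hi):
    """Constrain a value to the interval [lo, hi]."""
    return max(lo, min(hi, value))

class DirectionalButton:
    """Button that slides along one axis between minimum and maximum travel."""
    def __init__(self, travel_min=0.0, travel_max=0.01, press_fraction=0.8):
        self.travel_min = travel_min
        self.travel_max = travel_max
        # The button counts as pressed once it covers this share of its travel.
        self.press_threshold = travel_min + press_fraction * (travel_max - travel_min)
        self.position = travel_min

    def push(self, displacement):
        """Move the button, clamped to its travel limits."""
        self.position = clamp(self.position + displacement,
                              self.travel_min, self.travel_max)
        return self.is_pressed()

    def is_pressed(self):
        return self.position >= self.press_threshold

btn = DirectionalButton()
btn.push(0.002)   # partial travel: not yet pressed
btn.push(0.010)   # clamped at maximum travel: pressed
```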
After creating the specimen, it must be picked up or grabbed. Oculus Integration includes the OVRGrabbable and OVRGrabber scripts for the CustomHandRight and CustomHandLeft prefabs for interacting with objects in the scene. However, we opted to disable these tools in favor of VRTK’s grabbing system, which offers more control. This system simply requires adding all the objects one wants to interact with to the Interactable.Primary_Grab.secondary_Swap script.

Spindle

The spindle is the component to which the workpiece is securely fixed, utilizing a chuck or jaw system. This system is operated by a wrench used to open and close the jaws.
To enable the insertion of the wrench into any of the chuck’s drilled holes, we defined a collider for each hole and at the end of the wrench as seen in Figure 7a. When these colliders interact, it indicates the user’s intention to insert the wrench, triggering the following events:
  • A new wrench, fixed to the spindle and aligned with the selected hole, becomes visible.
  • The mesh renderer of the main wrench the user is handling is hidden to avoid duplication.
  • A flag is activated to allow the jaws to move.
When the collision between the colliders ceases, it signifies the user’s intention to remove the wrench, reversing the above events.
When the wrench is inserted into a chuck hole as shown in Figure 7b, it follows the rotation of the user’s virtual hand. The rotation is tracked by reading the quaternion of the hand. The direction of the wrench’s rotation is determined by the vector resulting from the cross-product of the wrench’s vector before and after the hand’s rotation. This direction dictates the movement of the jaws.
The displacement of the jaws occurs whenever the wrench is inserted into a hole, indicated by an activated flag, and continues until the jaws grip the workpiece.
For inserting the specimen into the spindle, a dedicated collider detects the user’s intention to insert the specimen. When this collider is triggered, and the user releases the specimen, it is automatically centered and fixed in position on the spindle as seen in Figure 8.
To enhance realism, the jaws are programmed to secure the specimen only when they are sufficiently tightened around its radius. For user convenience, the jaw opening is displayed numerically.
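The direction and grip logic just described can be sketched as follows: the cross product of the wrench direction before and after the hand’s rotation, projected onto the spindle axis, gives the turning direction, and the jaws only secure the specimen once the opening reaches its radius. This Python sketch illustrates the geometry only; function names are illustrative and not part of the Unity implementation.

```python
def cross(a, b):
    """Cross product of two 3D vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def rotation_direction(wrench_before, wrench_after, spindle_axis):
    """Turning direction of the wrench about the spindle axis.

    The projection of cross(before, after) onto the axis gives
    +1 (e.g. close the jaws), -1 (open them), or 0 (no rotation).
    """
    proj = dot(cross(wrench_before, wrench_after), spindle_axis)
    return (proj > 0) - (proj < 0)

def jaws_grip(jaw_opening, specimen_radius, tol=1e-6):
    """The jaws secure the specimen only once tightened to its radius."""
    return jaw_opening <= specimen_radius + tol

axis = (0.0, 0.0, 1.0)
before = (1.0, 0.0, 0.0)
after = (0.0, 1.0, 0.0)   # 90 degrees counter-clockwise about the axis
direction = rotation_direction(before, after, axis)   # +1: close the jaws
```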
Both the wrench used on the spindle and the specimen must be configured using the Interactable.Primary_Grab.secondary_Swap script to enable virtual hand gripping.

Tailstock Carriage

The tailstock tip or centering bit can be mounted on the tailstock carriage, with both elements configured under the Interactable.Primary_Grab.secondary_Swap script to allow virtual hand manipulation.
A collider is introduced at the end of the shaft to detect when any of these objects are inserted. When a user inserts an object with virtual hands, presented in Figure 9a,b, and releases it, the object is fixed onto the shaft.
A crank handle located on the tailstock carriage, which allows virtual hand gripping, enables the fixation of elements to the axis, as seen in Figure 9c. The action of this crank handle interacts with the colliders of the object fixed to the shaft, allowing or preventing user manipulation.
The tailstock carriage features a collider that allows interaction for movement along the bed guides. When the virtual hand grips the collider, the carriage moves accordingly. This movement is constrained between the end of the workpiece and the end of the sliding guides.
To lock the tailstock, the tailstock lever is used. When this lever, gripped by the virtual hand, collides with a collider at the end of its travel, an event is triggered that deactivates the tailstock collider, preventing further movement of the carriage (see Figure 10a,b).
Additionally, the tailstock carriage’s steering wheel is equipped with a function similar to that of the chuck key. This enables the carriage to move forward or backward based on the steering wheel’s rotation direction. The protruding horizontal handle of the steering wheel is designed for virtual hand gripping, allowing precise control over the carriage’s movement as seen in Figure 10c.
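The handwheel behavior described above reduces to mapping rotation to displacement within travel limits: positive turns advance the carriage, negative turns retract it, and motion is clamped between the workpiece and the end of the guides. A minimal Python sketch of this logic follows (the class name and the millimeters-per-turn value are illustrative assumptions, not taken from the application).

```python
def clamp(value, lo, hi):
    """Constrain a value to the interval [lo, hi]."""
    return max(lo, min(hi, value))

class HandwheelCarriage:
    """Carriage driven by a handwheel: rotation maps to displacement,
    constrained between the workpiece end and the end of the guides."""
    def __init__(self, pos_min, pos_max, mm_per_turn=2.0):
        self.pos_min = pos_min
        self.pos_max = pos_max
        self.mm_per_turn = mm_per_turn  # illustrative pitch value
        self.position = pos_min

    def turn(self, turns):
        """Positive turns advance the carriage; negative turns retract it."""
        self.position = clamp(self.position + turns * self.mm_per_turn,
                              self.pos_min, self.pos_max)
        return self.position

c = HandwheelCarriage(pos_min=0.0, pos_max=150.0)
c.turn(10)    # 10 turns forward: 20.0 mm
c.turn(-100)  # large retraction is clamped at the rear limit: 0.0 mm
```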

Startup

To start the lathe, the switches shown in Figure 11 must be used. From left to right, these switches are a main power switch that supplies power to all elements connected to the lathe, a pushbutton that exclusively powers the lathe chuck, and a startup button that activates the lathe chuck. This arrangement ensures a controlled and sequential startup process, enhancing safety and operational efficiency in the virtual training environment.
For the chuck to start, all emergency stops must be in the off position. This condition is verified by checking the value of a Boolean variable associated with each emergency stop. There are four emergency stops: three voluntary and one involuntary. Voluntary emergency stops (see Figure 12a) are activated by manual or foot switches (note that the foot switch is ineffective in our simulation due to the lack of simulation hardware). The involuntary emergency stop occurs when the user lifts the protective screen of the lathe as seen in Figure 12b.
All the switches and the protective screen mentioned are user manipulable, incorporating either the RotationalJointDrive or DirectionalJointDrive prefab depending on whether the action required is rotational or linear.
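The interlock condition described above amounts to checking one Boolean per emergency stop before allowing the chuck to start. The sketch below illustrates this logic in Python; the class and the stop labels are illustrative, not identifiers from the Unity project.

```python
class LatheInterlock:
    """Chuck start-up is permitted only when every emergency stop is off."""
    def __init__(self):
        # One Boolean per emergency stop, as in the simulation:
        self.stops = {
            "manual_1": False,
            "manual_2": False,
            "foot": False,           # inert in the simulation (no pedal hardware)
            "screen_lifted": False,  # involuntary stop: protective screen raised
        }
        self.main_power = False

    def chuck_may_start(self):
        """True only with main power on and no active emergency stop."""
        return self.main_power and not any(self.stops.values())

lathe = LatheInterlock()
lathe.main_power = True
lathe.stops["screen_lifted"] = True   # lifting the screen blocks the chuck
```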

Longitudinal and Transverse Carriages and Tool Holder

The displacement of the longitudinal, transverse, and tool holder carriages is implemented using the same function as the tailstock carriage, allowing movement in either direction based on the rotation direction, within predefined limits as seen in Figure 13. Additionally, the steering wheels of these carriages are included in the Interactable.Primary_Grab.secondary_Swap script, enabling them to be gripped and controlled by the user.

Automatic Cylindrical Turning and Facing

The lathe supports automatic cylindrical turning and facing operations via a lever. This lever is part of the Interactable.Primary_Grab.secondary_Swap script, enabling user interaction, and is connected using the RotationalJointDrive script. The lever’s position is determined by two colliders located at its endpoints as represented in Figure 14.
These operations automatically move the carriages, with the progress determined by settings on the bench. Colliders are redefined to establish the movement boundaries for the carriages.
The bench contains three levers and the Norton box, which define the carriages’ automatic feed based on the selected operation. These components share the same properties as previously analyzed levers, with their positions determined by colliders as seen in Figure 15.
The displacement value of the carriages is determined by the position of the levers and is referenced in two diagrams located on the headstock. To facilitate user reading, these diagrams can be enlarged by the user clicking on them, functioning as interactive buttons as presented in Figure 16.
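Conceptually, the lever positions act as a key into the feed table printed on the headstock diagrams. The Python sketch below illustrates this lookup; the lever labels and feed values are placeholders for illustration only, not the actual values on the CMZ T-360 diagrams.

```python
# Hypothetical feed table: (Norton lever, gear lever) -> feed rate in mm/rev.
# The real values are read from the two diagrams on the headstock; the
# numbers below are illustrative placeholders.
FEED_TABLE = {
    ("A", 1): 0.05,
    ("A", 2): 0.10,
    ("B", 1): 0.20,
    ("B", 2): 0.40,
}

def carriage_feed(norton_lever, gear_lever):
    """Resolve the automatic feed selected on the bench levers."""
    try:
        return FEED_TABLE[(norton_lever, gear_lever)]
    except KeyError:
        raise ValueError("lever combination not defined on the headstock diagrams")

feed = carriage_feed("A", 2)   # 0.10 mm/rev in this illustrative table
```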

Sound and Particle Systems

To enhance the virtual experience, a sound system was implemented that replicates the noise of the lathe during startup and machining. These sounds were recorded directly in the laboratory and are played in a loop during the machining process. Additionally, the movement of various levers is accompanied by characteristic sounds whenever their positions change, including the coupling of the torque wrench on the spindle jaws.
A particle system has also been created to emulate the chips generated during machining. This system activates whenever the cutting tool is in contact with the specimen, providing a realistic visual representation of the machining process.
These audio and visual enhancements significantly improve the realism and immersion of the virtual training environment, closely simulating real-world lathe operations.

2.2.3. Operation of the Virtual Lathe

The various components of the lathe can be manually moved, as described in the previous sections, to perform the desired machining. Once an operation is initiated, either manually or automatically, according to the programs allowed by the CMZ T-360 lathe, the virtual lathe operation is simulated by the interaction of the tool placed on the tool post and the specimen. Colliders were placed on both parts to facilitate this.
To simulate the machining operation, the original specimen consists of an assembly of basic elements, the number of which depends on the selected specimen length, as shown in Figure 17. These elements are formed by a filled disc that fits inside a hollow cylinder, whose height has been determined experimentally.
During a machining operation, when the tool’s collider intersects with the collider of the hollow cylinder of a basic element, the filled disc scales to become tangent to the tool, while the hollow cylinder becomes hidden to simulate the machining process. If the tool changes position and collides with the filled disc, the disc’s scale adjusts to remain tangent to the tool. By applying these operations to the various basic elements the tool interacts with, it is possible to perform external cylindrical turning, face turning, and external taper turning.
Conversely, during a machining operation, when the tool collides with the filled disc of a basic element, the exterior cylinder scales to become internally tangent to the tool, while the interior disc becomes hidden to simulate the machining operation. If the tool changes position and collides with the hollow cylinder, the cylinder’s scale adjusts to remain internally tangent to the tool. By applying these operations to the different basic elements the tool interacts with, it is possible to perform internal cylindrical turning.
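The disc-in-cylinder update rule described in the last two paragraphs can be sketched as follows: an external cut shrinks the filled disc to the tool radius and hides the cylinder, while an internal cut enlarges the cylinder’s bore to the tool radius and hides the disc. This is a simplified Python illustration of the rule; the class name and attributes are illustrative, not the Unity objects themselves.

```python
class BasicElement:
    """One axial slice of the specimen: a filled disc that fits inside
    a hollow cylinder (slice heights determined experimentally)."""
    def __init__(self, outer_radius, height):
        self.height = height
        self.disc_radius = outer_radius    # radius of the filled disc
        self.cylinder_inner_radius = 0.0   # bore of the hollow cylinder
        self.disc_visible = True
        self.cylinder_visible = True

    def external_cut(self, tool_radius):
        """External turning: the disc scales down to stay tangent to the
        tool, and the hollow cylinder is hidden."""
        if tool_radius < self.disc_radius:
            self.cylinder_visible = False
            self.disc_radius = tool_radius

    def internal_cut(self, tool_radius):
        """Internal turning: the cylinder's bore scales up to stay
        internally tangent to the tool, and the disc is hidden."""
        if tool_radius > self.cylinder_inner_radius:
            self.disc_visible = False
            self.cylinder_inner_radius = tool_radius

# A 100 mm specimen of radius 20 mm split into ten 10 mm slices:
elements = [BasicElement(outer_radius=20.0, height=10.0) for _ in range(10)]
elements[0].external_cut(17.5)   # cylindrical turning pass on the first slice
```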

2.2.4. Activity with the Virtual Lathe

The activity conducted during the virtual reality experience involved machining the part shown in Figure 18 on the virtual lathe without considering dimensional or geometric tolerances. This concise and focused activity provided users with hands-on practice in virtual machining, enhancing their understanding and skills in lathe operations.
Each student individually participated in the virtual lathe machining experience under the supervision of a professor. The professor monitored the operations projected on a screen, making observations and noting students’ comments in real time as seen in Figure 19.
This setup allowed for direct oversight and immediate feedback, enhancing the learning experience by providing personalized guidance and capturing insightful observations.
Although there was no time limit for completing the practice, most students finished within a 50 min session. Prior to this, since none of the students had previous experience with virtual reality, each was given a 20 min practice session with the application, supervised by a professor.
Since all students had prior experience with the real lathe, no additional explanations about the lathe’s operation were necessary. The training focused solely on how to wear the HMD and use the controllers.
This preparation ensured that students were comfortable with the virtual reality setup, allowing them to effectively engage with the virtual lathe training.

2.3. Evaluation of the User Experience

To evaluate the user experience, we used the 12-item questionnaire shown in Table 1, which was based on questions posed in the following prestigious tests:
  • QUESI (Hurtienne and Naumann [56]), to evaluate the intuitiveness of product, software, and game developments (questions Q10 and Q11).
  • ITC-SOPI (Lessiter et al. [57]), to assess participants’ sense of presence and participation in the virtual environment, whether the content is perceived as realistic, and the appearance of negative effects (questions Q1 to Q6).
  • NASA-TLX (Hart and Staveland [58]), to assess the subjective workload experienced by participants (questions Q7 and Q8).
  • UEQ (Laugwitz [59]), to assess the subjective user experience (questions Q9 and Q12).
Participants answered questions on a Likert scale ranging from 1 to 5, where 1 indicates strong disagreement and 5 indicates strong agreement.
To analyze the potential influence of the designed software’s usability on user experience evaluations, we used the System Usability Scale (SUS) created by Brooke [60]. The SUS is a reliable tool for assessing the usability of any device, application, or product. It involves responding to a series of Likert-scale questions, with responses from 1 (strongly disagree) to 5 (strongly agree). The specific questions used in our usability assessment are listed in Table 2.
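The SUS aggregates the ten Likert responses into a single 0–100 score using Brooke’s standard scoring rule [60]: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the sum is scaled by 2.5. A minimal sketch of this rule (not the authors’ actual analysis code):

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses (SUS-Q1..SUS-Q10) to a 0-100 score.

    Odd-numbered (positively worded) items contribute (response - 1);
    even-numbered (negatively worded) items contribute (5 - response);
    the summed contributions are scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5
```

For example, a respondent answering 5 to every positive item and 1 to every negative item scores 100, while uniformly neutral answers (all 3s) score 50.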

3. Results

The results for the assessment of the user experience, detailed in Table 3, indicate that users found the virtual reality lathe simulation to be highly realistic and immersive, with strong agreement on the sensation of presence (M = 5.0, SD = 0.0) and realism of the environment (M = 5.0, SD = 0.0). The usability of the application was rated highly (M = 4.75, SD = 0.79), and most users found it effective for training purposes (M = 4.75, SD = 0.79). Minimal discomfort (M = 1.15, SD = 0.49) and perceived risk (M = 1.15, SD = 0.49) further support the viability of the virtual reality application as a training tool. The only area with more mixed feedback was the ease of operating the virtual lathe compared to the real lathe (M = 4.05, SD = 0.83), suggesting a need for slight adjustments in the interface or controls.
The higher standard deviations in the responses to Q10, Q11, and Q12 indicate greater variability in user experiences regarding usability and the application’s validity for training, highlighting areas for potential improvement. Figure 20 presents violin plots for the questions with SD > 0. These plots offer a clearer view of the distribution of the participants’ responses and confirm that Q11, on the ease of operating the virtual lathe versus the real one, is the question with the greatest variability.
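The per-item descriptive statistics of Table 3 can be reproduced with Python’s standard library. The sketch below uses the sample (n − 1) standard deviation, which is consistent with the reported values; the response vector shown is hypothetical, chosen only to illustrate the calculation:

```python
import statistics as st

def describe(responses):
    # Summarize one Likert item as in Table 3: mean, median, mode, and
    # sample standard deviation (n - 1 denominator), rounded to 2 decimals.
    return {
        "mean": round(st.mean(responses), 2),
        "median": st.median(responses),
        "mode": st.mode(responses),
        "sd": round(st.stdev(responses), 2),
    }

# Hypothetical 20-response vector for a negatively worded item such as Q1.
q1 = [1] * 18 + [2, 3]
print(describe(q1))  # mean 1.15, sd 0.49
```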
Regarding the results of the SUS questionnaire shown in Table 4, the mean SUS score of 96.25 indicates the excellent usability of the virtual reality lathe simulation. According to industry standards, SUS scores above 85 are considered excellent, which places this application well within that range. The high mean score suggests that users found the application highly usable and user friendly. The standard deviation of 6.41 reflects the variability in the responses. Given the high mean score, this relatively low standard deviation indicates consistent positive feedback across most users. The minimum score of 80 still falls within the acceptable range, though it is slightly lower than the excellent threshold, suggesting that, while some users found minor usability issues, overall satisfaction remained very high.
Figure 21 shows the violin plots for the SUS questions with SD > 0 and for the SUS score, which reflect the low variability in the responses, except for questions SUS-Q1, SUS-Q3, and SUS-Q9. As shown in Table 4, all positively worded questions (odd-numbered) received average scores equal to or higher than 4.5, and all negatively worded questions (even-numbered) received scores equal to or lower than 1.1, indicating that users found the app to have good usability.
A Spearman’s rank-order correlation was run to determine the relationship between SUS score and the questions used for the assessment of the user experience with SD > 0. The results are presented in Table 5.
There was a moderate, positive correlation between SUS score and questions Q6 (lighting effects), Q10 (usability and friendliness), and Q12 (validity for training), which was statistically significant (rs(18) = 0.58, p = 0.008) in all three cases. There was also a moderate, negative correlation between SUS score and Q1, related to discomfort during the experience (rs(18) = −0.58, p = 0.008), and Q8, related to risk perception (rs(18) = −0.57, p = 0.008). These two variables are themselves strongly related, since the risk users associated with the application (Q8) stems from cybersickness episodes (Q1) (rs(18) = 0.99, p < 0.001).
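The correlations in Table 5 were presumably computed with a statistics package; conceptually, Spearman’s rho is the Pearson correlation of the tie-adjusted (average) ranks of the two variables. A pure-Python sketch of the rho computation (p-value calculation omitted):

```python
from math import sqrt

def avg_ranks(xs):
    # Assign 1-based ranks, giving tied values the average of their positions.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    # Spearman's rho = Pearson correlation of the rank vectors.
    rx, ry = avg_ranks(x), avg_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sqrt(sum((a - mx) ** 2 for a in rx))
    sy = sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)  # undefined if either variable is constant
```

With n = 20 participants, the degrees of freedom are n − 2 = 18, matching the df row in Table 5.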

4. Discussion

A virtual lathe machining application was developed to allow users to manipulate different controls and levers using virtual hands connected to Oculus Rift devices. The application received highly positive feedback from most users due to its ease of use and active learning environment.
Several commercial developments are similar to our work, such as the one presented by Aura Interact [50]; however, no scientific study of its usability or functionality has been published. Another virtual reality application [51] allows users to operate a lathe, but it likewise lacks published evaluations of usability and functionality. It guides users through tutorials for machining operations without allowing them the freedom to machine independently, and its controls are operated using cursors, which reduces the effectiveness of the simulation.
Other applications [52,53] present virtual lathe machining in a game format, guiding users through predefined stages from tool and part identification to final machining. However, these also restrict user freedom. In [54], a virtual lathe machining application allows user freedom but requires three different types of specimens for operations that our application performs with a single type. There is no usability study for this application. In contrast, [55] analyzes the usability of a virtual lathe application, but it lacks sufficient detail about the lathe’s characteristics.
Our experience highlighted that excessive attempts to replicate reality can introduce unexpected issues. Two main problems arose from the use of gravity and faithful reproduction of the lathe:
  • Tool Drop Issues: The tightening key and the specimen fell and became trapped within the lathe’s structure; this occurred four times.
  • Re-Grabbing Difficulties: When tools fell to the floor, grabbing them again was difficult, leading to the controllers colliding with the ground.
Additionally, the application lacked the tools necessary for precision machining, such as micrometers and gauges, which prevented us from applying the tolerances indicated in the manufacturing plan.
Observations of user behavior during the virtual reality experience revealed three stages:
  • Amazement and expectation: Initial interaction with the virtual world, characterized by caution and thoughtfulness, lasting around 20 min.
  • Concentration and effort: Increased confidence and familiarity with the application, showing keen interest in proper operation and expected outcomes.
  • Excitement and curiosity: After achieving the expected results, users showed interest in exploring and experimenting with non-standard actions.
These stages suggest that virtual reality enhances user engagement and willingness to learn compared to conventional lathe operation. Users expressed confidence in the virtual tool, leading to experimentation, in contrast to a more measured approach with the real lathe.

5. Conclusions

The literature review highlights a clear trend toward using virtual environments for teaching and learning processes. Virtual reality (VR) applications for manufacturing practical sessions can mitigate the high costs associated with traditional teaching laboratories. Limited resources often force multiple students to share the same machinery during laboratory sessions, reducing individual participation. In some cases, the teacher operates the machinery while students merely observe. Additionally, the growing adoption of VR across many fields makes its integration into education highly beneficial, as it introduces students to a technology they will increasingly encounter.
Based on the results from the System Usability Scale (SUS) and user experience tests conducted after the VR experience, several key conclusions emerge. First, all students rated the application’s usability very positively. The high SUS scores indicate that usability did not negatively impact the user experience evaluations. The VR application was well received by a broad group of students, confirming that VR applications are generally accepted and appreciated in educational settings. Some users reported physical issues such as dizziness or fatigue, which negatively affected their VR experience. While the percentage of users sensitive to VR was relatively small (10%), it is essential to analyze and address these issues to improve the overall experience.
The immersive VR application presented in this study offers an experience very close to reality. Users felt part of a virtual world, with no perceived risks, and the machinery behaved as it would in real life. The positive feedback underscores the potential of VR in manufacturing laboratories involving machining operations.
Among the advantages of using the developed VR application are the following:
  • VR allows laboratories to be established with significantly lower investments than physical laboratories.
  • Virtual environments eliminate the need for maintenance and consumables, reducing ongoing costs.
  • Using VR in education familiarizes students with technologies that are increasingly prevalent in various fields.
  • VR provides a safe environment for machinery operation, minimizing the risk of accidents.
  • Users can repeat or conduct new practices independently, provided they have access to VR equipment.
However, there are some disadvantages. Some real-world sensations, such as the resistance of moving parts due to their weight, are difficult to replicate in a virtual environment, and implementing VR in education requires additional effort from developers. Even so, the benefits in student engagement and in providing a realistic, risk-free learning environment are substantial. This study demonstrates that VR is a valuable tool for modernizing and improving practical training in technical education.
Future improvements should focus on minimizing physical discomfort and further refining the application’s realism and interactivity, for example, providing haptic feedback through vibrations in the hand controllers. Addressing these areas will enhance the overall effectiveness of and user satisfaction with the developed application.

Author Contributions

Conceptualization, M.C., F.M. and J.C.; methodology, A.M. and J.C.; software, A.M. and F.M.; validation, A.M., M.C. and J.C.; formal analysis, A.M., M.C. and J.C.; investigation, A.M., M.C., F.M. and J.C.; resources, M.C., F.M. and J.C.; data curation, M.C., F.M. and J.C.; writing—original draft preparation, J.C.; writing—review and editing, M.C.; visualization, M.C., F.M. and J.C.; supervision, M.C. and J.C.; project administration, J.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The Code of Good Research Practices of the Universidad Politécnica de Cartagena, in section “12.1. Areas of research subject to specific regulations”, does not contemplate the need for prior authorization by the Ethics Committee of the University for a study of the characteristics described in this paper where two instructional techniques (physical vs. virtual equipment) are studied.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wheatstone, C. Contributions to the physiology of vision—Part the first. On some remarkable and hitherto unobserved phenomena of binocular vision. In Abstracts of the Papers Printed in the Philosophical Transactions of the Royal Society of London; The Royal Society: London, UK, 1843; pp. 76–77.
  2. Wade, N.J. Wheatstone and the origins of moving stereoscopic images. Perception 2012, 41, 901–924.
  3. Ackerman, E. Before virtual reality was cool. IEEE Spectr. 2016, 53, 68.
  4. Sutherland, I.E. A head-mounted three dimensional display. In Proceedings of the December 9–11, 1968, Fall Joint Computer Conference, Part I (AFIPS ’68 (Fall, Part I)), San Francisco, CA, USA, 9–11 December 1968; Association for Computing Machinery: New York, NY, USA, 1968; pp. 757–764.
  5. Boyer, S. A virtual failure: Evaluating the success of Nintendo’s Virtual Boy. Velv. Light Trap 2009, 64, 23–33.
  6. Gleasure, R.; Feller, J. A rift in the ground: Theorizing the evolution of anchor values in crowdfunding communities through the Oculus Rift case study. J. Assoc. Inf. Syst. 2016, 17, 1.
  7. Heim, M. Virtual Realism; Oxford University Press: Oxford, UK, 1998.
  8. Pantelidis, V.S. Virtual reality and engineering education. Comput. Appl. Eng. Educ. 1997, 5, 3–12.
  9. Leonard, S.N.; Fitzgerald, R.N. Holographic Learning: A Mixed Reality Trial of Microsoft HoloLens in an Australian Secondary School. Res. Learn. Technol. 2018, 26.
  10. Chang, C.-W.; Lee, J.-H.; Wang, C.-Y.; Chen, G.-D. Improving the authentic learning experience by integrating robots into the mixed-reality environment. Comput. Educ. 2010, 55, 1572–1578.
  11. Mateu, J.; Alaman, X. CUBICA: An Example of Mixed Reality. J. Univers. Comput. Sci. 2013, 19, 2598–2616.
  12. Chao, J.; Chiu, J.L.; DeJaegher, C.J.; Pan, E.A. Sensor-augmented virtual labs: Using physical interactions with science simulations to promote understanding of gas behavior. J. Sci. Educ. Technol. 2016, 25, 16–33.
  13. Enyedy, N.; Danish, J.A.; DeLiema, D. Constructing liminal blends in a collaborative augmented-reality learning environment. Int. J. Comput. Support. Collab. Learn. 2015, 10, 7–34.
  14. Han, J.; Jo, M.; Hyun, E.; So, H.J. Examining young children’s perception toward augmented reality in fused dramatic play. Educ. Technol. Res. Dev. 2015, 63, 455–474.
  15. Huber, T.; Paschold, M.; Hansen, C.; Wunderling, T.; Lang, H.; Kneist, W. New Dimensions in Surgical Training: Immersive Virtual Reality Laparoscopic Simulation Exhilarates Surgical Staff. Surg. Endosc. 2017, 31, 4472–4477.
  16. Keefe, F.; Huling, D.; Coggins, M.; Keefe, D.; Rosenthal, Z.; Herr, N.; Hoffman, H. Virtual Reality for Persistent Pain: A New Direction for Behavioral Pain Management. Pain 2012, 153, 2163–2166.
  17. Matsas, E.; Vosniakos, G. Design of a Virtual Reality Training System for Human–robot Collaboration in Manufacturing Tasks. Int. J. Interact. Des. Manuf. 2015, 11, 139–153.
  18. Mól, A.C.; Jorge, C.A.; Couto, P.M. Using a Game Engine for VR Simulations in Evacuation Planning. IEEE Comput. Graph. Appl. 2008, 28, 6–12.
  19. Zhang, S.; Moore, A. The Usability of Online Geographic Virtual Reality for Urban Planning. In Innovations in 3D Geo-Information Sciences; Lecture Notes in Geoinformation and Cartography; Isikdag, U., Ed.; Springer: Cham, Switzerland, 2013.
  20. De la Peña, N.; Weil, P.; Llobera, J.; Spanlang, B.; Friedman, D.; Sanchez-Vives, M.; Slater, M. Immersive Journalism: Immersive Virtual Reality for the First-Person Experience of News. Presence Teleoperators Virtual Environ. 2010, 19, 291–301.
  21. Staurset, E.; Prasolova-Førland, E. Creating a Smart Virtual Reality Simulator for Sports Training and Education. Smart Innov. Syst. Technol. 2016, 59, 423–433.
  22. Lai, C.; McMahan, R.P.; Kitagawa, M.; Connolly, I. Geometry Explorer: Facilitating Geometry Education with Virtual Reality. In Virtual, Augmented and Mixed Reality; Springer International Publishing: Cham, Switzerland, 2016; pp. 702–713.
  23. Stefan, L. Immersive Collaborative Environments for Teaching and Learning Traditional Design. Procedia Soc. Behav. Sci. 2012, 51, 1056–1060.
  24. Ai-Lim Lee, E.; Wong, K.; Fung, C. How Does Desktop Virtual Reality Enhance Learning Outcomes? A Structural Equation Modeling Approach. Comput. Educ. 2010, 55, 1424–1442.
  25. Van der Linden, A.; van Joolingen, W. A Serious Game for Interactive Teaching of Newton’s Laws. In Proceedings of the 3rd Asia-Europe Symposium on Simulation & Serious Gaming—VRCAI ’16, New York, NY, USA, 3–4 December 2016; pp. 165–167.
  26. Abulrub, A.; Attridge, A.; Williams, M. Virtual Reality in Engineering Education: The Future of Creative Learning. In Proceedings of the 2011 IEEE Global Engineering Education Conference (EDUCON), Amman, Jordan, 4–6 April 2011.
  27. Pruna, E.; Rosero, M.; Pogo, R.; Escobar, I.; Acosta, J. Virtual reality as a tool for the cascade control learning. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2018; Volume 10850, pp. 243–251.
  28. Burghardt, A.; Szybicki, D.; Gierlak, P.; Kurc, K.; Pietruś, P.; Cygan, R. Programming of industrial robots using virtual reality and digital twins. Appl. Sci. 2020, 10, 486.
  29. Pérez, L.; Diez, E.; Usamentiaga, R.; García, D.F. Industrial robot control and operator training using virtual reality interfaces. Comput. Ind. 2019, 109, 114–120.
  30. Garcia, C.A.; Caiza, G.; Naranjo, J.E.; Ortiz, A.; Garcia, M.V. An Approach of Training Virtual Environment for Teaching Electro-Pneumatic Systems. IFAC-PapersOnLine 2019, 52, 278–284.
  31. Chrysoulas, C.; Homay, A.; Lemac, M. Teaching industrial automation concepts with the use of virtual/augmented reality-The IEC 61499 case. In Proceedings of the 17th International Conference on Information Technology Based Higher Education and Training (ITHET), Olhao, Portugal, 26–28 April 2018; pp. 1–6.
  32. Cvetkovski, G.; Petkovska, L.; Di Barba, P.; Mognaschi, M.E.; Kamińska, D.; Firych-Nowacka, A.; Wiak, S.; Digalovski, M.; Celeska, M.; Rezaei, N.; et al. ViMeLa Project: An innovative concept for teaching mechatronics using virtual reality. Prz. Elektrotechniczny 2019, 95, 18–21.
  33. Garcia, C.A.; Naranjo, J.E.; Ortiz, A.; Garcia, M.V. An approach of virtual reality environment for technicians training in upstream sector. IFAC-PapersOnLine 2019, 52, 285–291.
  34. Morales-Menéndez, R.; Ramírez-Mendoza, R.A. Virtual/remote labs for automation teaching: A cost effective approach. IFAC-PapersOnLine 2019, 52, 266–271.
  35. Román-Ibáñez, V.; Pujol-López, F.A.; Mora-Mora, H.; Pertegal-Felices, M.L.; Jimeno-Morenilla, A. A low-cost immersive virtual reality system for teaching robotic manipulators programming. Sustainability 2018, 10, 1102.
  36. Shen, B.; Wang, Y.; Zhang, X.; Cheng, H. Virtual Reality Design of industrial robot teaching based on unity3D. In Proceedings of the 7th International Symposium on Mechatronics and Industrial Informatics (ISMII), Zhuhai, China, 22–24 January 2021.
  37. Wang, H.; Wang, Z. Research on PLC Simulation Teaching Platform Based on Unity. In Proceedings of the 2020 International Conference on Intelligent Design (ICID), Xi’an, China, 11–13 December 2020; pp. 15–18.
  38. Chen, C.S.; Su, B.X.; Guo, M.H.; Zhong, Y.T.; Yang, Y.F.; Kuo, H.L. Applying virtual reality to control of logical control mechanism system. In Proceedings of the 2018 IEEE International Conference on Applied System Invention (ICASI), Tokyo, Japan, 13–17 April 2018; pp. 520–523.
  39. Togias, T.; Gkournelos, C.; Angelakis, P.; Michalos, G.; Makris, S. Virtual reality environment for industrial robot control and path design. Procedia CIRP 2021, 100, 133–138.
  40. Kalinov, I.; Trinitatova, D.; Tsetserukou, D. WareVR: Virtual reality interface for supervision of autonomous robotic system aimed at warehouse stocktaking. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–20 October 2021; pp. 2139–2145.
  41. Bekele, M.K.; Champion, E. A Comparison of Immersive Realities and Interaction Methods: Cultural Learning in Virtual Heritage. Front. Robot. AI 2019, 6, 91.
  42. Stock, C.; Bishop, D.; O’Connor, A.N.; Chen, T.; Pettit, C.J.; Aurambout, J.P. SIEVE: Collaborative Decision-making in an Immersive Online Environment. Cartogr. Geogr. Inf. Sci. 2008, 35, 133–144.
  43. Loeffler, C.E.; Anderson, T. The Virtual Reality Casebook; Van Nostrand Reinhold: New York, NY, USA, 1994.
  44. Allmacher, C.; Dudczig, M.; Knopp, S.; Klimant, P. Virtual reality for virtual commissioning of automated guided vehicles. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 838–839.
  45. Aldea, A.; Tinga, A.M.; Van Zeumeren, I.M.; Van Nes, N.; Aschenbrenner, D. Virtual Reality Tool for Human-Machine Interface Evaluation and Development (VRHEAD). In Proceedings of the 2022 IEEE Intelligent Vehicles Symposium (IV), Aachen, Germany, 5–9 June 2022; pp. 151–158.
  46. Malik, A.A.; Masood, T.; Bilberg, A. Virtual reality in manufacturing: Immersive and collaborative artificial-reality in design of human-robot workspace. Int. J. Comput. Integr. Manuf. 2020, 33, 22–37.
  47. Vergara, D.; Rubio, M.P.; Lorenzo, M. On the Design of Virtual Reality Learning Environments in Engineering. Multimodal Technol. Interact. 2017, 1, 11.
  48. Hernández-Chávez, M.; Cortés-Caballero, J.M.; Pérez-Martínez, A.A.; Hernández-Quintanar, L.F.; Roa-Tort, K.; Rivera-Fernández, J.D.; Fabila-Bustos, D.A. Development of Virtual Reality Automotive Lab for Training in Engineering Students. Sustainability 2021, 13, 9776.
  49. Salah, B.; Abidi, M.H.; Mian, S.H.; Krid, M.; Alkhalefah, H.; Abdo, A. Virtual reality-based engineering education to enhance manufacturing sustainability in Industry 4.0. Sustainability 2019, 11, 1477.
  50. Lathe Machine-VR Training by Aura Interact. Available online: https://www.youtube.com/watch?v=12ThSZynTI0 (accessed on 16 June 2024).
  51. DC VRT Lathe. Available online: https://www.youtube.com/playlist?list=PLRC00Ly5naBtH50PgqsqePf6cYoJy99iv (accessed on 16 June 2024).
  52. Lathe Safety Simulator VR. Available online: https://www.youtube.com/watch?v=hJ2RBhiDovY (accessed on 16 June 2024).
  53. Train Safely on the Virtual Lathe Machine. Available online: https://eonreality.com/train-safely-virtual-lathe/?lang=es (accessed on 16 June 2024).
  54. Hui-Chin, C. A Novel Training System of Lathe Works on Virtual Operating Platform. J. Softw. Eng. Appl. 2010, 3, 287–302.
  55. Florentia, A.; Riwinoto, R. Usability Analysis: Virtual Reality-Based Lathe Machine Operation Simulation Application. In Proceedings of the 5th International Conference on Applied Engineering, Batam, Indonesia, 5 October 2022.
  56. Hurtienne, J.; Naumann, A. QUESI-A Questionnaire for Measuring the Subjective Consequences of Intuitive Use; Interdisciplinary College: Möhnesee, Germany, 2010.
  57. Lessiter, J.; Freeman, J.; Keogh, E.; Davidoff, J. A Cross-Media Presence Questionnaire: The ITC-Sense of Presence Inventory. Presence Teleoperators Virtual Environ. 2001, 10, 282–297.
  58. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. Adv. Psychol. 1988, 52, 139–183.
  59. Laugwitz, B.; Held, T.; Schrepp, M. Construction and Evaluation of a User Experience Questionnaire. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2008.
  60. Brooke, J. SUS: A “Quick and Dirty” Usability Scale. In Usability Evaluation in Industry; CRC Press: Boca Raton, FL, USA, 1996; pp. 207–212.
Figure 1. Lathe components.
Figure 2. Oculus Rift equipment.
Figure 3. (a,b) Virtual lathe.
Figure 4. Specimen vending machine.
Figure 5. Collider for position detection.
Figure 6. Spindle.
Figure 7. (a,b) Clenched claws.
Figure 8. Spindle collider for positioning the specimen.
Figure 9. (a–c) Fixing the centering bit to the tailstock carriage.
Figure 10. (a–c) Displacement of the tailstock carriage.
Figure 11. Startup switches and pushbuttons.
Figure 12. (a,b) Emergency stops.
Figure 13. Longitudinal and transverse carriage and tool holder.
Figure 14. Cylindrical turning and facing operations.
Figure 15. Configuration of cylindrical turning and facing operations.
Figure 16. Visualization of tables for configuration of turning and facing operations.
Figure 17. Fundamentals of machining operations.
Figure 18. Machining practice performed by students on the virtual lathe.
Figure 19. Working with the lathe in virtual reality. Projection for the students in the lab.
Figure 20. Violin plots for questions with SD > 0. The black square represents the mean value.
Figure 21. Violin plots for SUS questions with SD > 0 and SUS score. The black square represents the mean value.
Table 1. Questions used for the assessment of the user experience.

Q1: The experience has caused me dizziness or other discomfort.
Q2: I felt the sensation of being present at the scene and that events were really happening.
Q3: I felt like I was in a real facility.
Q4: I have felt the objects as real in the virtual environment.
Q5: The colors and textures are similar to the real thing.
Q6: Lighting effects are appropriate.
Q7: The lathe works like the real thing.
Q8: I have perceived risk of some kind.
Q9: I’ve found it useful to be part of the virtual scene.
Q10: The app is usable and user-friendly.
Q11: It is easier to operate the lathe than the virtual version.
Q12: I consider the application to be valid for training and testing.
Table 2. Questions used in the usability assessment (SUS).

SUS-Q1: I would use this tool frequently.
SUS-Q2: I find this tool unnecessarily complex.
SUS-Q3: The tool was easy to use.
SUS-Q4: I would need the help of somebody with technical knowledge of this tool.
SUS-Q5: The tool’s functionality is well integrated.
SUS-Q6: The tool is inconsistent.
SUS-Q7: Most people could learn how to use this tool very quickly.
SUS-Q8: The tool is very difficult to use.
SUS-Q9: I feel confident using this tool.
SUS-Q10: I had to learn many things before I could use this tool.
Table 3. Results of the user experience assessment (1 = strongly disagree, 5 = strongly agree).

            Q1    Q2    Q3    Q4    Q5    Q6    Q7    Q8    Q9    Q10   Q11   Q12
Mean        1.15  5.00  5.00  5.00  5.00  4.85  5.00  1.15  5.00  4.75  4.05  4.75
Median      1.00  5.00  5.00  5.00  5.00  5.00  5.00  1.00  5.00  5.00  4.00  5.00
Mode        1.00  5.00  5.00  5.00  5.00  5.00  5.00  1.00  5.00  5.00  4.00  5.00
Std. Dev.   0.49  0.00  0.00  0.00  0.00  0.49  0.00  0.49  0.00  0.79  0.83  0.79
Table 4. Results of the usability assessment (1 = strongly disagree, 5 = strongly agree).

            Mean   Median  Mode    SD    Minimum  Maximum
SUS-Q1      4.50   5.00    5.00    1.10  1        5
SUS-Q2      1.00   1.00    1.00    0.00  1        1
SUS-Q3      4.70   5.00    5.00    0.66  3        5
SUS-Q4      1.10   1.00    1.00    0.31  1        2
SUS-Q5      4.90   5.00    5.00    0.31  4        5
SUS-Q6      1.00   1.00    1.00    0.00  1        1
SUS-Q7      4.90   5.00    5.00    0.31  4        5
SUS-Q8      1.00   1.00    1.00    0.00  1        1
SUS-Q9      4.60   5.00    5.00    0.68  3        5
SUS-Q10     1.00   1.00    1.00    0.00  1        1
SUS score   96.25  100.00  100.00  6.41  80.0     100
Table 5. Spearman’s rank-order correlations between SUS score and the user experience questions with SD > 0.

                      Q1      Q6     Q8      Q10    Q11    Q12
SUS  Spearman’s rho   −0.58   0.58   −0.57   0.58   0.35   0.58
     df               18      18     18      18     18     18
     p-value          0.008   0.008  0.008   0.008  0.131  0.008

Share and Cite

MDPI and ACS Style

Conesa, J.; Martínez, A.; Mula, F.; Contero, M. Learning by Doing in VR: A User-Centric Evaluation of Lathe Operation Training. Electronics 2024, 13, 2549. https://doi.org/10.3390/electronics13132549
