Article

A Novel Mixed Reality Solution Based on Learning Environment for Sutures in Minor Surgery

Ana Rojo, Laura Raya and Alberto Sanchez
1 Centro Universitario de Tecnología y Arte Digital (U-tad), Las Rozas, 28290 Madrid, Spain
2 Department of Computer Science & Statistics, Universidad Rey Juan Carlos, Móstoles, 28933 Madrid, Spain
3 Research Center for Computational Simulation, Scientific and Technological Park, Boadilla del Monte, 28660 Madrid, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(5), 2335; https://doi.org/10.3390/app11052335
Submission received: 14 February 2021 / Revised: 27 February 2021 / Accepted: 1 March 2021 / Published: 6 March 2021
(This article belongs to the Special Issue Innovative Solutions for Augmented and Virtual Reality Applications)

Abstract

Suturing in minor surgery is a fundamental skill for healthcare professionals. However, in the educational field, the practice of suturing is often limited, with more theoretical than practical study. To facilitate learning, our goal is to develop an immersive and interactive educational tool, called Suture MR, that complements theoretical study. This application could enhance suture procedural skills in the fields of nursing and medicine. Applying Mixed Reality techniques, we generate a 3D model of an arm with a full-scale wound, and the user realistically simulates the suture movements as part of the learning process. The application includes surgical clamps and a needle holder that are virtually visualized in the user's hands, allowing gestures and movements faithful to the real ones. In this article, we demonstrate the usability of our environment and the feasibility of using Mixed Reality learning experiences in practical clinical training as a complement to theoretical training. The results of the study reveal a greater perception of learning and the willingness of students to use this methodology.

1. Introduction

The union between theory and practice in clinical training is generally one of the main concerns of educational institutions. Medical schools recognize a dissonance between what students learn in college and what they learn in healthcare centers during their clinical placements and residencies [1].
In the educational context, incorporating real practical scenarios into the teaching methodology is not affordable, nor is reproducing the dynamism of the clinic or the hands-on experience with technological advances incorporated into clinical practice. Therefore, different efforts have been made to reduce the differences between both realities. Common alternatives are (i) mannequins, which are expensive; (ii) the use of animals, which is subject to legal and ethical restrictions; and (iii) in some cases, the collaboration of colleagues to recreate practical cases. The biggest disadvantages of these alternatives are the inability to repeat the task an unlimited number of times, the limited customization of the intervention, and the discomfort of evaluating the student [2].
The emergence of immersive technologies led clinicians to use Virtual Reality (VR) environments, which allow training that progresses in complexity and the possibility of repeating an exercise until the technique is assimilated [3]. These advantages make it possible to define a valuable, personalized educational experience [4,5]. Augmented Reality (AR) and Mixed Reality (MR) technologies allow virtual objects to be overlaid on the user's real environment [6]. The main difference between them is that MR improves the recognition of geometric surfaces in the real world, allowing a much more realistic integration of virtual content. Immersive technologies stimulate emotional adherence in the subject, leading to a motivating learning process. These technologies not only allow students to be placed in highly controlled situations without real-life consequences, but AR/MR also makes it possible to see the outcomes of poor decision-making by learners [7].
In addition, the ease of access to current technologies makes extended realities an increasingly everyday tool, creating urgency in the field of education to take advantage of their pedagogical benefits. Immersive technologies are becoming strong alternatives for generating clinical cases and simulators in educational training.
To target this dual academic-clinical scenario, we propose an MR approach built on the combination of gesture detection and hand tracking to implement an interactive learning activity. This activity engages the motor coordination of users, favoring their involvement and motivation in learning suture techniques and basic suture materials. We have developed this solution as an MR application on the Magic Leap HMD. We tested the user experience and retention potential of the MR learning environment through an experimental group of 32 subjects.
Indeed, AR has facilitated medical surgical planning and guidance, owing to its potential for placing relevant clinical data within the clinician's field of view [8]. Augmented platforms for visualizing medical images over video capture have been applied both to preoperative planning of orthopedic surgery and to guidance systems [9,10].
MR training applications have been developed to help clinicians improve their surgical skills and understand spatial relationships and concepts. A case study is Turini et al.'s orthopedic open-surgery simulation with HoloLens technology [11], whose conclusions suggest the suitability of MR technology for displaying relevant anatomical views in this application. According to Barsom et al.'s review [12], there are currently about 30 medical simulators for educational purposes intended specifically for training in laparoscopic surgery, neurosurgery procedures, and echocardiography. The advantage of MR and AR trainers is the ability to combine a physical simulation with a virtual overlay, creating more realistic situations [13].
In some research, training in surgery or suturing has been based on virtual simulators with haptic technology designed for practicing these techniques in hospitals [14]. However, most of them focus on laparoscopy or arthroscopy techniques, and the high cost of kinaesthetic technology has prevented their widespread use. Taking these models as references for the efficiency of simulation tools in suture training, we propose a more affordable, portable, and easy-to-configure solution that addresses the learning of suture techniques without haptic hardware.

2. Materials and Methods

The present work addresses two fundamental guidelines: (i) the design and development of a technological solution in MR and (ii) the validation of its feasibility as a learning tool.

2.1. Fundamentals on Minor Surgery Sutures

Suturing is a surgical procedure to join tissues that have been sectioned by a wound or surgical incision; closing a wound is typically the last phase of any surgical technique [15]. In the ligation or approximation of tissues, threads are used to support and help approximate the edges of a wound. The stitches distribute the tensile force longitudinally along the wound until the natural healing process is satisfactorily established, preventing the wound from reopening. This practice involves the implantation of synthetic materials in human tissue; therefore, it is advisable to know the characteristics of the materials used and the appropriate suture techniques for each case.
Determining factors for an adequate intervention are the suture procedure, the materials (threads), and the type of needle. For this reason, we propose a training tool to support learning of the suturing process by offering practice of the four basic patterns and the study of the different types of needles according to their size, section, and type of point.

2.2. Mixed Reality Application as Learning Environment

Nowadays, learning methodologies focus on stimulating, motivating, and promoting the emotional and experiential involvement of the apprentice in the learning process, which increases the possibility of knowledge retention. According to Zapata et al., the greater the individual's emotional response to a stimulus held in short-term memory, the greater the possibility of successfully engaging the brain's chemical conversion systems to transform it into long-term memory [16]. The correct application of learning strategies plays a fundamental role in the retention of knowledge and the exercise of memory [17]. The design of Suture MR is driven by certain requirements that our learning ecosystem must meet:
  • Ubiquity: offering a system that can be used at any time and place, without dependencies.
  • Creation of the experience: having the user configure the practice case increases the impact on their ability to remember information associated with decision-making.
  • Solving a case study: promotes the student's analytical capacities.
  • Promoting movement coordination: coordinating limb movements favors the perception of spatial content and mobilizes cerebral structures different from those engaged by reading and writing.
  • Repetition: subdividing a large objective into smaller tasks and inducing the repetition of those tasks reinforces short-term memory through repetitive learning.

2.3. Suture MR Workflow

We developed our graphical software application in Unity version 2018.1.9f2-MLP using C#. Our solution is divided into three phases, as shown in the User Experience (UX) workflow diagram (see Figure 1); students learn practical suture skills as the experience progresses (see Supplementary Materials).
The first phase consists of creating the educational practice case, which relates the desired needle to the selected suture technique. The four techniques developed in Suture MR were selected according to the criterion of fundamental knowledge; including more complex techniques would be straightforward. The needle types incorporated in our tool follow the commercial standardization of the Serag-Wiessner manufacturer [18]. In this first phase, the user selects, through interaction with the panels, the type of procedure to be practiced and the needle configuration, as shown in the red modules of Figure 1.
Once phase I is completed, a tutorial explains how to use the tools for the selected procedure and how to start the interactive activity. After this tutorial, phase II begins: selection of the workspace. In Mixed Reality, the AR system must detect the volumes of the 3D elements of the physical space by means of scanning techniques based on point-cloud contour detection, which allows the subsequent reconstruction of a digital mesh of the space. From this information, the augmented reality system can calculate the occlusions of real objects with digital ones and also position the augmented objects on the surfaces detected in the scan. Usually, this function is provided by each vendor's development kit; for our application, we used the Magic Leap SDK to implement this specific module (phase II), which shows the user the result of the scanning phase. To let the user select the workspace based on whether the observed area is suitable, each fragment of the scanned surface is internally classified and labeled as vertical or horizontal, depending on its normal vector. When the user points at a surface with the cursor, a green pointer indicates that the area is suitable for positioning the 3D arm; otherwise, the pointer lights up red. The green-red feedback criterion is based on identifying whether the fragment the user is pointing at, together with its neighboring fragments, forms a horizontal surface whose total area is greater than the area occupied by the 3D arm. As a result, the user visualizes, through the Mixed Reality glasses, a realistic 3D arm model on their real-world desk (see Figure 2).
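As an illustration of this green-red criterion, the following is a minimal Unity C# sketch; MeshFragment and its fields are hypothetical simplifications of the meshing data exposed by the Magic Leap SDK, and the 0.8 normal-dot threshold is an assumed tolerance, not a value reported by the authors.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical wrapper around one scanned surface patch; in Suture MR the
// underlying data would come from the Magic Leap meshing results.
public class MeshFragment
{
    public Vector3 Normal;                                      // averaged patch normal
    public float Area;                                          // patch area
    public List<MeshFragment> Neighbors = new List<MeshFragment>();
}

public static class WorkspaceSelector
{
    const float UpDotThreshold = 0.8f; // assumed tolerance for labeling "horizontal"

    // A fragment is labeled horizontal when its normal points roughly upward;
    // otherwise it is treated as vertical.
    public static bool IsHorizontal(MeshFragment f) =>
        Vector3.Dot(f.Normal.normalized, Vector3.up) > UpDotThreshold;

    // Green pointer: the pointed fragment and its horizontal neighbors
    // together must cover at least the footprint of the 3D arm model.
    public static bool IsSuitable(MeshFragment pointed, float armFootprintArea)
    {
        if (!IsHorizontal(pointed)) return false;
        float area = pointed.Area;
        foreach (MeshFragment n in pointed.Neighbors)
            if (IsHorizontal(n)) area += n.Area;
        return area >= armFootprintArea;
    }
}
```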
At this point, phase III starts, and two interactive objects are projected onto the user's hands: the tweezers and the needle holder. In this last phase, the tool must enable realistic user interaction with the virtual arm. To this end, we promote a motor-coordination movement based on the activation of trigger points to complete the suture procedure (see Figure 3).

2.4. Gesture Detection

The automatic detection of user gestures and hand tracking requires access to the Hand Tracking Controller of Magic Leap [19]. This provides Suture MR with two main functions: gesture detection and tool orientation. The first is the detection of the "pinch" pose. The gesture detection method enables the movement and positioning of an object; consequently, we allow the user to interact through gestures (see Figure 4), and the methodology of this activity uses gestures as triggers. For correct gesture recognition, it is necessary to set the Pose Filter Level, Key Pose Confidence Value, and Key Point Filter Level parameters.
The Pose Filter Level parameter was set to Raw, as it is preferable to follow the pose smoothly and reduce the refresh rate, since rapid hand movements are not expected. The Key Pose Confidence Value is set to 0.85, as we expect the subject to make the pinch gesture as close to the reference as possible. With this restriction, the system allows the movement of the virtual tools only as long as the user performs the gesture with precision.
In the case of the Key Point Filter Level parameter, the key points of the hand correspond to the joints. With these points, we can reconstruct the current skeleton of the hand and detect the user's gesture. Key-point detection and gesture classification are performed every frame, raising a pose-detection event together with its confidence level. When the detected confidence level is lower than the fixed threshold, our system starts a deactivation window that extends over eight frames. If a new event indicates that the confidence level exceeds the predefined threshold before the deactivation window expires, the process is stopped and the gesture is maintained; otherwise, the gesture is no longer recognized and its functionality stops responding. We observed that a softer or stricter filter level affects the gesture activation and deactivation times; therefore, we selected the extra-soft filter option so that small variations of the confidence level around the threshold do not make the tool difficult to use [20].
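The confidence-threshold logic with the eight-frame deactivation window can be sketched as follows. This is an illustrative reconstruction, not Magic Leap SDK code; Update() is assumed to be called once per frame with the pinch-pose confidence reported by the hand-tracking API.

```csharp
// Minimal sketch of the eight-frame deactivation window described above.
public class PinchGestureFilter
{
    const float ConfidenceThreshold = 0.85f; // Key Pose Confidence Value
    const int DeactivationFrames = 8;        // frames before the gesture is released

    int framesBelowThreshold;
    public bool PinchActive { get; private set; }

    public void Update(float pinchConfidence)
    {
        if (pinchConfidence >= ConfidenceThreshold)
        {
            // Confidence at or above threshold (or recovered before the
            // window expired): keep the gesture active.
            framesBelowThreshold = 0;
            PinchActive = true;
        }
        else if (PinchActive && ++framesBelowThreshold >= DeactivationFrames)
        {
            // Window expired without recovery: release the gesture.
            PinchActive = false;
            framesBelowThreshold = 0;
        }
    }
}
```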

2.5. Hand Tracking and Tool Orientation

The needle holder and tweezers are initially arranged in a given orientation so that each tool can be controlled according to the orientation of the gesture detected for the corresponding hand. They are placed at the center of the corresponding hand, namely the Wrist/KeyPoint Hand Center (as shown in Figure 4), with the dissecting tweezers on the left and the needle holder on the right (see Figure 5). The transformation corresponding to the relative rotation of the fingertips with respect to the wrist can then be obtained from the direction vector with origin at the wrist and endpoint at the tip of the thumb; this direction indicates where the surgery tools should point. Algorithm 1 shows the pseudocode performed on each frame of the game loop, at a rendering rate of 80-90 frames per second. We use a quaternion to set the rotation of the object: the quaternion encodes a rotation with the target vector aligned with the Z axis, the cross product between the target vector and the upward direction aligned with the X axis, and the cross product between Z and X aligned with the Y axis. The approach used for the vector rotation is to represent a rotation by a unit quaternion q = (ω, r), with scalar (real) part ω and vector (imaginary) part r. The rotation is applied to a 3D vector v through the formula
v_r = q v q^{-1} = v + 2 r × (r × v + ω v)
Algorithm 1: Position and orientation right-hand-tracking algorithm for the augmented surgery tools.
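A minimal Unity C# sketch of the per-frame update that Algorithm 1 describes follows. GetHandCenter and GetThumbTip are hypothetical stand-ins for queries of the Magic Leap hand-tracking key points, and the Rotate helper shows the quaternion formula above applied directly.

```csharp
using UnityEngine;

// Minimal per-frame sketch: place the needle holder at the hand center and
// point it along the wrist-to-thumb-tip direction, as described in the text.
public class ToolOrientation : MonoBehaviour
{
    public Transform needleHolder;   // augmented surgery tool for the right hand

    void Update()                    // runs each frame of the game loop (80-90 fps)
    {
        Vector3 wrist = GetHandCenter();
        Vector3 thumbTip = GetThumbTip();

        // Direction vector with origin at the wrist and end at the thumb tip.
        Vector3 forward = (thumbTip - wrist).normalized;

        // Target vector aligned with Z; forward x up gives X; Z x X gives Y.
        Quaternion rotation = Quaternion.LookRotation(forward, Vector3.up);

        needleHolder.position = wrist;
        needleHolder.rotation = rotation;
    }

    // Applies v_r = v + 2 r x (r x v + w v) for a unit quaternion q = (w, r).
    static Vector3 Rotate(Quaternion q, Vector3 v)
    {
        Vector3 r = new Vector3(q.x, q.y, q.z);
        return v + 2f * Vector3.Cross(r, Vector3.Cross(r, v) + q.w * v);
    }

    Vector3 GetHandCenter() { return Vector3.zero; } // placeholder for key-point query
    Vector3 GetThumbTip()   { return Vector3.zero; } // placeholder for key-point query
}
```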

2.6. Experiment

Having designed and implemented the educational application Suture MR, the goals of our validation pilot study are to (i) verify the usability of the Suture MR application on a Mixed Reality device such as the Magic Leap, given the possible complexity of this new technology; (ii) evaluate its potential for academic use by collecting information on users' perception of learning; and (iii) evaluate the method of applying this tool as learning material. That is, we aim to analyze whether the use of this active learning tool hinders or facilitates the learning process of health students and the practice of this intervention.

2.7. Usability Metric

The SUS questionnaire is a standard scale for assessing the usability of technological systems, and it is easy for users to complete and understand. It is a 5-point Likert-style scale (where 1 = strongly disagree and 5 = strongly agree) that yields a single number representing the usability of the system under study. The final score ranges from 0 to 100 points, from no usability to best usability [21]. The test is composed of 10 items whose contributions each range between 0 and 4. For the final result, the item values are added, and the sum is multiplied by 2.5, giving the overall SUS value. Note that the scores for the 10 individual items are not significant in themselves.
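For clarity, the scoring rule reads as follows in code form; this is a minimal sketch assuming the ten item contributions have already been transformed to the 0-4 range as described above.

```csharp
// Minimal sketch of the SUS computation: the sum of the ten item
// contributions (each already in 0..4) multiplied by 2.5 gives the
// overall 0-100 score.
public static class Sus
{
    public static float Score(int[] itemContributions)
    {
        int sum = 0;
        foreach (int c in itemContributions) sum += c;   // each item contributes 0..4
        return sum * 2.5f;                               // overall SUS value, 0..100
    }
}
```

For example, hypothetical item contributions of (3, 3, 2, 4, 3, 3, 2, 3, 3, 3) sum to 29 and give a SUS score of 72.5.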

2.8. Procedures

The experiment is composed of three phases. The first and second are experimental sessions with the two learning methodologies: (i) Text-Reading and (ii) Mixed Reality practice. Text-Reading is a session for acquiring knowledge about suture techniques by reading a descriptive text (supervised by a health specialist). Mixed Reality practice is a session for acquiring knowledge of a suture technique using our Mixed Reality application. Each participant reviews a specified part of the theoretical content in each session. In the third phase, the participants complete a usability test of our tool and a questionnaire about their subjective perceptions of both forms of learning.
The experiment was carried out under the same conditions for all users, avoiding the influence of extraneous factors such as time, environment, and device. All participants took the tests in the same physical space with the same lighting, and both sessions lasted the same time (see Figure 2). Methodologically, the only difference between the sessions lies in the learning resource applied. This approach identifies the methodology of each session (Mixed Reality learning tool or Text-Reading) as the independent variable and the test results obtained after each session as the dependent variable.
Specifically, the subjects performed:
  • Active learning with the MR experience. The user started Suture MR, configured the experience, and carried out the interactive activity, which was limited to 10 min. Once it was finished, the participant took the retention test.
  • Traditional learning based on Text-Reading. The participant read the text for 10 min and then took the retention test.
  • Evaluation. Participants completed questionnaires about their personal profile, their subjective assessment of Suture MR, and the SUS usability questionnaire.
Each participant in the group was assigned one of the four suture procedures for each action and a specific needle configuration in phases I and II. The permutation of the four procedures over the two tests generates 16 different combinations for each group (of 16 participants). In summary, under these conditions, each participant studied different procedures in the Text-Reading and Mixed Reality sessions, and half of the participants swapped the order of the sessions (i.e., first Mixed Reality, then Text-Reading).
Regarding the evaluation system, all tests were composed of four multiple-choice questions, where hits add 1 point, errors subtract 0.5 points, and blank answers do not score. Thus, scores ranged from 0 to 4 points.
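A minimal sketch of this scoring rule follows; the floor at zero is an assumption made to match the stated 0-4 range, since the error penalty could otherwise produce negative totals.

```csharp
using System;

// Minimal sketch of the retention-test scoring rule described above.
public enum Answer { Correct, Wrong, Blank }

public static class RetentionTest
{
    public static float Score(Answer[] answers) // four multiple-choice questions
    {
        float score = 0f;
        foreach (Answer a in answers)
        {
            if (a == Answer.Correct) score += 1.0f;      // a hit adds 1 point
            else if (a == Answer.Wrong) score -= 0.5f;   // an error subtracts 0.5 points
            // blank answers do not score
        }
        return Math.Max(score, 0f);                      // assumed floor at 0, see above
    }
}
```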

2.9. Participants

The present study enrolled a total of 32 participants aged 18 to 30 years (18 women and 14 men), 50% of whom had a background in health sciences. The cohort was divided into two groups, balanced by gender and by health-field knowledge, in which the order of the sessions was exchanged. Additionally, the profile analysis of those surveyed reveals that for 46.9% of the participants this was their first contact with extended reality experiences, while the remaining 53.1% had occasionally experienced these technologies before.

3. Results

This section details the results obtained in our experiment. First, we present the results on Suture MR usability and the subjective evaluations of the system. We also assessed whether there are significant differences in learning outcomes depending on whether the Mixed Reality application was used before or after reading the text. This statistical analysis was carried out with RStudio.

3.1. Suture MR Usability Validation

The mean value of Suture MR usability, calculated with SUS, was 72.34 (SD = 10.35), as shown by the yellow line in Figure 6. To interpret the SUS score, it must be converted to a percentile rank. Based on the distribution of all scores, a raw SUS score of 72.34 converts to a percentile rank of 70%, meaning that the evaluated system has higher perceived usability than 70% of all products tested. In addition, previous research suggests, as a guideline for SUS score interpretation, that a score above 68 is considered acceptable [22]. We can conclude that Suture MR has been accepted and its ease of use verified: practical learning with our tool is usable, and the technological innovation does not hinder learning.

3.2. Users' Subjective Evaluation of Mixed Reality

In order to gauge users' opinions about learning with both methodologies, we prepared a short questionnaire with the following three Likert-scale questions and one multiple-choice question:
  • Do you consider that your attention span during the use of Suture MR was compromised by the use of a new learning technology (Mixed Reality)? (1 = Strongly Disagree, 5 = Strongly Agree)
  • Would you recommend the use of this tool, based on Mixed Reality, as a learning method? (1 = Strongly Disagree, 5 = Strongly Agree)
  • To continue learning, would you choose Suture MR out of the two learning methodologies? (1 = Strongly Disagree, 5 = Strongly Agree)
  • Choose the option you agree with: “I believe the learning has been greater after using...” (A) Text-Reading Methodology; (B) Mixed Reality Tool (Suture MR); (C) Both methodologies are comparable.
Regarding how the use of new technologies can affect attention, the survey results indicate that 10 of the 32 participants noticed limitations in their attention span caused by the need to adapt to the technology, 10 were neutral, and the remaining 12 considered that their attention was only slightly affected or not affected at all. As for the overall assessment of our learning tool, 93.3% agreed or strongly agreed with recommending its use for learning (see Figure 7). Note that 48% of those surveyed consider that their learning was greater using Suture MR, 38.7% consider both methodologies comparable, and the remaining 12.9% indicate that they learned more following the traditional methodology (see Figure 8).

3.3. Descriptive Statistical Analysis of the Influence of Learning-Session Order on Test Scores

We assessed whether the scores obtained are influenced by the order in which the sessions are taken: first the Text-Reading session and then the interactive learning session, or the reverse. This analysis aims to determine whether the session order may have affected the test results. We first applied the Shapiro-Wilk test to check the normality of each population. Only one of the four statistical tests carried out obtained p ≥ 0.05 (see Table 1), so we generally assume the samples do not follow a normal distribution. We then carried out a Wilcoxon rank-sum test to compare the population medians. As shown in Table 2, the p-values lead us to reject the equality of the population medians, which confirms that the order in which the learning sessions were carried out influences the results obtained.

3.4. General Analysis of Learning Test Outcomes

We analyzed which combination is the most beneficial by observing the overall average test results.
The median values in the statistics table (see Table 3) indicate that users obtain better test results (see the retention evaluation procedure in Section 2.8) after the second learning session (Me = 3.250) than after the first (Me = 2.500), regardless of which methodology was applied first. As for the order of the learning process, the most advisable order according to the results obtained is the one followed by group 1: first reading the text and then the interactive experience. Following this order, the user initially acquires theoretical knowledge during the text session and can then internalize it in the interactive Mixed Reality session through active learning.

4. Discussion

In this article, we presented a learning tool based on an active methodology developed for Mixed Reality technology. We used a next-generation device (Magic Leap) to take advantage of the capabilities that today's technology can offer in the field of education; the development of this type of solution for this technology remains notably scarce.
The usability results, the users' subjective opinion questionnaires, and the recommendations on using Suture MR as part of a learning methodology are briefly summarized as follows:
  • Users rated the usability of the Mixed Reality solution at 72.34/100 points. This satisfactory level of usability, which falls in the third quartile, indicates that no adaptation of our approach is necessary.
  • The results of the users' subjective perception questionnaire reveal that users find that Suture MR encourages their learning process to a greater extent than reading the text alone would.
  • The analysis of test scores suggests that combining a practical experience with the Mixed Reality solution after reading the theoretical material improves retention of the subject matter.
In short, Suture MR is presented as a useful, usable, ubiquitous, and motivating system that offers an interactive proposal for teaching academic content specialized in basic sutures of minor surgery.
Considering that active learning has the capacity to engage the user emotionally and strengthen the internalization of information, Suture MR meets objectives that would be difficult to satisfy by reading texts. That is why users perceive the use of Suture MR as effective in terms of learning.
Despite the positive results obtained, several issues identified during the trials could be improved, and this study can be understood as a starting point for future work. Future research lines include expanding and differentiating the contents shown for each suture technique, incorporating new fixed informative panels that reinforce theoretical knowledge, integrating a user-guidance system for performing the suture stitches, and conducting new studies aimed at medical and nursing students with greater experience in the use of extended realities. Furthermore, it would be interesting to evaluate long-term memory to assess the potential of the learning tool as an experiential learning ecosystem.

Supplementary Materials

The following are available online at https://www.mdpi.com/2076-3417/11/5/2335/s1, Video S1: Design, development and evaluation of a mixed reality solution based on learning environment for sutures in minor surgery.

Author Contributions

Conceptualization and Methodology, A.R. and L.R.; software, formal analysis, and writing—original draft preparation, A.R.; resources and writing—review and editing, L.R. and A.S.; validation and investigation, A.R., L.R. and A.S.; funding acquisition, A.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Spanish Ministry of Science, Innovation and Universities grant number RTI2018-098694-B-I00.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Weller, J.M. Simulation in undergraduate medical education: Bridging the gap between theory and practice. Med. Educ. 2004, 38, 32–38.
  2. Naylor, R.A.; Hollett, L.A.; Valentine, R.J.; Mitchell, I.C.; Bowling, M.W.; Ma, A.M.; Dineen, S.P.; Bruns, B.R.; Scott, D.J. Can medical students achieve skills proficiency through simulation training? Am. J. Surg. 2009, 198, 277–282.
  3. Zhu, E.; Hadadgar, A.; Masiello, I.; Zary, N. Augmented reality in healthcare education: An integrative review. PeerJ 2014, 2, e469.
  4. Wojciechowski, R.; Cellary, W. Evaluation of learners' attitude toward learning in ARIES augmented reality environments. Comput. Educ. 2013, 68, 570–585.
  5. Di Serio, A.; Ibáñez, B.M.; Kloos, D.C. Impact of an augmented reality system on students' motivation for a visual art course. Comput. Educ. 2013, 68, 586–596.
  6. Hanna, M.G.; Ahmed, I.; Nine, J.; Prajapati, S.; Pantanowitz, L. Augmented Reality Technology Using Microsoft HoloLens in Anatomic Pathology. Arch. Pathol. Lab. Med. 2018, 142, 638–644.
  7. Albrecht, U.-V.; Folta-Schoofs, K.; Behrends, M.; von Jan, U. Effects of Mobile Augmented Reality Learning Compared to Textbook Learning on Medical Students: Randomized Controlled Pilot Study. J. Med. Internet Res. 2018, e182.
  8. Tepper, O.M.; Rudy, H.L.; Lefkowitz, A.; Weimer, K.A.; Marks, S.M.; Stern, C.S.; Garfein, E.S. Mixed Reality with HoloLens: Where virtual reality meets augmented reality in the operating room. Plast. Reconstr. Surg. 2017, 140, 1066–1070.
  9. Wu, X.; Liu, R.; Yu, J.; Xu, S.; Yang, C.; Yang, S.; Shao, Z.; Ye, Z. Mixed Reality Technology Launches in Orthopedic Surgery for Comprehensive Preoperative Management of Complicated Cervical Fractures. Surg. Innov. 2018, 25, 421–422.
  10. Li, Y.; Chen, X.; Wang, N.; Zhang, W.; Li, D.; Zhang, L.; Qu, X.; Cheng, W.; Xu, Y.; Chen, W.; et al. A wearable mixed-reality holographic computer for guiding external ventricular drain insertion at the bedside. J. Neurosurg. 2018, 1–8.
  11. Turini, G.; Condino, S.; Parchi, P.D.; Viglialoro, R.M.; Piolanti, N.; Gesi, M.; Ferrari, M.; Ferrari, V. A Microsoft HoloLens Mixed Reality Surgical Simulator for Patient-Specific Hip Arthroplasty Training; Springer Nature: Basingstoke, UK, 2018; pp. 201–210.
  12. Barsom, E.Z.; Graafland, M.; Schijven, M.P. Systematic review on the effectiveness of augmented reality applications in medical training. Surg. Endosc. 2016, 30, 4174–4183.
  13. Kobayashi, L.; Zhang, X.C.; Collins, S.A.; Karim, N.; Merck, D.L. Exploratory Application of Augmented Reality/Mixed Reality Devices for Acute Care Procedure Training. West. J. Emerg. Med. 2018, 19, 9158–9164.
  14. Choi, K.; Chan, S.; Pang, W. Virtual Suturing Simulation Based on Commodity Physics Engine for Medical Learning. J. Med. Syst. 2012, 36, 1781–1793.
  15. Blanco, J.M.A.; Fortet, J.R.C.; Pata, N.R.; Olaso, A.S.; Guztke, M.M. Suturas básicas y avanzadas en cirugía menor (III). Med. Fam. SEMERGEN 2002, 28, 89–100.
  16. Zapata, L.F.; Reyes, C.D.L.; Lewis, S.; Barceló, E. Memoria de trabajo y rendimiento académico en estudiantes de primer semestre de una universidad de la ciudad de Barranquilla. Psicol. Caribe 2002, 28, 66–82.
  17. Cabero, J.; Barroso, J. Posibilidades educativas de la Realidad Aumentada. J. New Approaches Educ. Res. 2016, 5, 44–50.
  18. Serag-Wiessner. Surgical Needles. Available online: https://www.serag-wiessner.de/en/products/surgical-needles/ (accessed on 24 February 2021).
  19. Hand Tracking Key Points in Unity. Available online: https://developer.magicleap.com/learn/guides/hand-tracking-key-points-unity (accessed on 18 June 2019).
  20. Hand Poses in Unity. Available online: https://developer.magicleap.com/learn/guides/unity-sdk-0-22-gestures-in-unity (accessed on 25 April 2019).
  21. Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B.A., Weerdmeester, B., McClelland, I.L., Eds.; Taylor & Francis: London, UK, 1996; pp. 189–194.
  22. Measuring Usability with the System Usability Scale (SUS). Available online: https://measuringu.com/sus/ (accessed on 16 July 2019).
Figure 1. Suture MR UX workflow. The diagram shows the user experience design of Suture MR, detailing the selection content of each phase. Phase I (red color) enables needle selection, allowing different needle configurations (indicated by colored arrows), and finally, suture procedure selection; phase II (green color) enables setting the workspace; and phase III (yellow color) consists of practicing the interactive activity.
Figure 2. Participant testing Suture MR on the Magic Leap device. The left image shows the augmented environment displayed on the Magic Leap. The right image shows the user's performance during the interactive activity.
Figure 3. Interactive task schema for simple interrupted suture. The first trigger point (blue color) is activated by the insertion of the needle holder. The thread path is shown in purple. The second trigger point (yellow color) is activated by the insertion of the tweezers.
Figure 4. Magic Leap hand key points for gesture detection [19].
Figure 5. Interaction activity on Magic Leap handling tweezers and needle holder. The left image shows the tweezer's interaction with the first interactive point of a stitch. The center image depicts the enabling of the following stitch in the wound. The right image illustrates the needle-holder's interaction with the second interactive point of a stitch.
Figure 6. Graphic representation of SUS score.
Figure 7. Results of Question 2 of the Users' Subjective Evaluation Questionnaire: users' recommendation of the Suture MR tool for learning.
Figure 8. Results of Question 4 of the Users' Subjective Evaluation Questionnaire: users' subjective appreciation of greater learning for each methodology.
Table 1. Results (statistic W and p-value) of retention scores for each learning-session order, obtained with the Shapiro–Wilk normality test.

Test/Group     | Order: Text–Suture MR    | Order: Suture MR–Text
Mixed Reality  | W = 0.8689, p = 0.0262   | W = 0.7675, p = 0.0010
Text-Reading   | W = 0.7586, p = 0.0008   | W = 0.7876, p = 0.0019
Table 2. Results (statistic W and p-value) of retention scores for the different session orders, obtained with the Wilcoxon rank-sum test.

Test/Group     | Order of Sessions
Mixed Reality  | W = 0.95, p = 0.1936
Text-Reading   | W = 134, p = 0.82
Table 3. Statistics (median, mean, and variance) of retention scores for the different methods by group of participants.

Group (Sessions Order) | Session        | Median | Mean  | Variance
Group 1                | Mixed Reality  | 2.500  | 2.313 | 1.391
                       | Text-Reading   | 3.250  | 2.969 | 1.268
Group 2                | Text-Reading   | 2.500  | 2.969 | 1.023
                       | Mixed Reality  | 3.250  | 2.969 | 1.152
