Integration of Virtual Reality in the Control System of an Innovative Medical Robot for Single-Incision Laparoscopic Surgery
Abstract
1. Introduction
2. Literature Review
3. Materials and Methods
3.1. The Slave Robotic System
- robot rigid frame;
- operating table;
- kinematic chain 1;
- kinematic chain 2;
- kinematic chain 3;
- instrument orientation module 1;
- active instrument 1;
- instrument orientation module 2;
- active instrument 2;
- endoscopic camera.
3.2. Singularity Analysis and Workspace of Parallel Robotic Structure
- Values exactly equal to zero;
- Very small values of the evaluation result, which could suggest proximity to a singularity zone;
- Sign changes from one computed value to the next, which would indicate a crossing through zero (a numerical sketch of this test is given after this list).
- Case 1. Based on the graphical representation of the values (computed with respect to the variation of the linear coordinates of the TCP), no points were identified in the robot’s workspace where the value of the factor was zero or close to zero. The range of variation was as follows:
- Case 2. Based on the graphical representation of the values illustrated in Figure 4 (computed with respect to the variation of the linear coordinates of the TCP), no points were identified in the robot’s workspace where the value of the factor was zero or close to zero. The range of variation was:
- The total workspace of the robot with respect to the required orientation angles for the endoscopic camera, namely:
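As an illustration of the singularity test above, the sketch below scans a grid of TCP positions for zeros, near-zeros, and sign changes between consecutive samples. This is a minimal sketch: `EvaluateDeterminant` is a hypothetical stand-in for the analytic determinant factor of the robot's Jacobian, and the grid bounds and thresholds are illustrative, not the values used in the paper.

```csharp
using System;

class SingularityScan
{
    // Hypothetical stand-in: the real expression depends on the robot geometry.
    static double EvaluateDeterminant(double x, double y, double z)
    {
        return Math.Sin(x) + Math.Cos(y) * z;
    }

    static void Main()
    {
        const double eps = 1e-6;      // threshold for "close to zero" (assumed)
        double previous = double.NaN; // previous sample along the scan order

        for (double x = -0.2; x <= 0.2; x += 0.01)
        for (double y = -0.2; y <= 0.2; y += 0.01)
        for (double z = 0.1; z <= 0.4; z += 0.01)
        {
            double d = EvaluateDeterminant(x, y, z);

            if (Math.Abs(d) < eps)
                Console.WriteLine($"Near-singular point at ({x:F2}, {y:F2}, {z:F2})");
            else if (!double.IsNaN(previous) && Math.Sign(d) != Math.Sign(previous))
                Console.WriteLine($"Sign change near ({x:F2}, {y:F2}, {z:F2})");

            previous = d;
        }
    }
}
```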
3.3. The Master Console
3.3.1. The Multi-Modal Master Control Architecture
3.3.2. Software Application Development
C# Software Analysis
- Eleven use cases that represent the functionalities of the C# software application.
- Three actors:
  - the user, i.e., the external entity that interacts with the C# application;
  - the script implemented in Arduino;
  - the Unity virtual reality application.
- Relations between the user and the use cases, as well as relations among the use cases.
- GUI class: enables the user to interact with the C# application. To build the graphical user interface, eight classes from the System.Windows.Forms package are used.
- ArduinoConnexion class: establishes the connection with the Arduino script and handles the transmission of the data acquired from the sensors.
- UnityConnexion class: establishes the connection with the virtual application developed in Unity.
- FuzzyModule class: implements a specific artificial intelligence algorithm [49], based on the fuzzy technique, to control the robot. For this purpose, three classes from the AForge.Fuzzy package are used: FuzzySet, TrapezoidalFunction, and InferenceSystem.
- C#Main class: constitutes the principal class of the C# application, containing four objects, i.e., one object of each of the previously presented classes.
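A minimal sketch of this composition is shown below. The class names follow the paper (with “C#Main” written as CSharpMain, a valid C# identifier); the empty class bodies are placeholders, not the authors' implementation.

```csharp
// Stubs for the four classes described above (placeholders only).
public class GUI { /* Windows Forms user interface */ }
public class ArduinoConnexion { /* link to the Arduino script and its sensors */ }
public class UnityConnexion { /* TCP/IP link to the Unity VR application */ }
public class FuzzyModule { /* AForge.Fuzzy inference (see Section 3.3.3) */ }

// The principal class composes one object of each class.
public class CSharpMain
{
    private readonly GUI gui = new GUI();
    private readonly ArduinoConnexion arduino = new ArduinoConnexion();
    private readonly UnityConnexion unity = new UnityConnexion();
    private readonly FuzzyModule fuzzy = new FuzzyModule();
}
```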
User Interface
- manually, by means of buttons and sliders on the user interface;
- automatically, using a control device equipped with sensors that attaches to the user’s upper limb, combined with voice control.
- To establish the connection between the user interface and the virtual reality application through the TCP/IP protocol, the “ConnectVRApp” button must be pressed (Figure 12 (2));
- Data transmission between the user interface and the virtual reality application starts only after pressing the “StartApp” button (Figure 12 (2));
- Control of the robotic system is performed by means of sliders (Figure 12 (3)), as follows:
  - Laparoscope insertion: inserts the laparoscope (Figure 2 (9)) into the virtual patient’s body;
  - Control orientation instrument 1: controls the orientation of instrument 1 (Figure 2 (5));
  - Insert instrument 1: inserts instrument 1 into the body of the virtual patient (Figure 2 (6));
  - Control orientation instrument 2: controls the orientation of instrument 2 (Figure 2 (7));
  - Insert instrument 2: inserts instrument 2 into the body of the virtual patient (Figure 2 (8));
  - Kinematic chain control: controls the kinematic chains of the robotic system structure (Figure 2 (3–5));
- For visualization in the virtual reality application from several angles, five viewing cameras can be set (Cam1...Cam5), and organ visualization in the virtual patient’s body is enabled by pressing the “ON” button (Figure 12 (4));
- By means of the sliders, the user can control the robotic system to insert the laparoscope and the two active instruments into defined points positioned on the kidneys within a recorded time; when the three points are reached, the stopwatch stops and the elapsed time is recorded in a file (Figure 12 (5));
- Setting values for three virtual sensors (heart rate (HR), temperature, and oxygen saturation (SpO2)) that are attached to the virtual patient (Figure 12 (6));
- Fields in which messages are displayed for monitoring the virtual sensor values and the collisions that occur when inserting instruments into the virtual patient.
- To create the connection between the microcontroller and the C# application (user interface) running on the computer via the Wi-Fi network protocol, the “ConnectionESP32” button must be pressed; from that moment, the button displays the “DisconnectionESP32” status (Figure 13 (2)) on a red background;
- By pressing the “Start” button, the “Stop” status appears on the button (Figure 13 (2)), and from that moment the data from the microcontroller are transmitted to the C# application. For optimal functioning, the sensors are calibrated by moving the device attached to the user’s upper limb along three axes; when the calibration is successfully executed, the status “ON” appears in the “Calibration status” field (Figure 13 (2)) on a yellow background. After the calibration is successfully accomplished, the sensors can be used to control the robotic system.
- To connect the C# application to the virtual reality application via the TCP/IP protocol, the “ConnectVRApp” button must be pressed; the virtual reality application starts, the connection is created, and the button status reads “Disconnection” on a red background (Figure 13 (3)). Bidirectional data communication between the C# application and the virtual reality application begins after pressing the “StartApp” button, whose status then becomes “Stop” on a red background (Figure 13 (3)); a minimal sketch of this connection is given after this list;
- To use the sensor-equipped control device attached to the user’s upper limb, the “Start Control Speech rec.” button must be pressed; its color then changes to yellow-green (Figure 13 (4)). Pressing the button starts the stopwatch (Figure 13 (5)) and activates the voice recognition commands, which are combined with the commands of the sensor device to control the robotic system from within the virtual reality application. When a command is activated via voice recognition, the color of the corresponding button changes from blue to yellow (Figure 13 (4)). These commands are as follows:
  - KCC: controls the kinematic chains of the robotic system structure (Figure 2 (3–5));
  - Lap: laparoscope control (Figure 2 (9));
  - CM 1: the module for controlling the rotation and insertion of instrument 1 (Figure 2 (5,6));
  - CM 2: the module for controlling the rotation and insertion of instrument 2 (Figure 2 (7,8));
  - Stop C: stops control of the robotic system (Figure 13 (4));
  - Cam 1…Cam 5: five viewing cameras used for different angles in the virtual reality application (Figure 13 (4));
  - Organs Visualization ON: command used to visualize the internal organs of the virtual human patient (Figure 13 (4));
- Upon touching each point positioned on the kidneys with the laparoscope and the two active instruments, an LED lights up; when all three points have been successfully touched, the stopwatch stops (Figure 13 (5)) and the time is recorded in a file.
- Field for setting the values of the sensors (heart rate, temperature, and oxygen) attached to the virtual human patient (Figure 13 (6));
- Field where messages are displayed regarding the medical condition of the virtual human patient (Figure 13 (7)), monitored using three sensors (heart rate, temperature, and oxygen). This field also displays messages alerting the user of the robotic system to collisions that may occur between the active instruments and the internal organs of the virtual human patient during the surgical procedure.
Program for ESP32 Microcontroller
- utility/imumaths.h: library of mathematical utilities (vectors, quaternions, matrices);
- Adafruit_Sensor.h, Adafruit_BNO055.h: libraries for using the BNO055 sensor;
- WiFi.h: library used for the Wi-Fi network protocol;
- Wire.h: I2C library used for the communication of the ESP32 microcontroller with the BNO055 IMU sensor.
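The following is a minimal sketch of how these libraries could be combined on the ESP32, assuming the device streams the BNO055 orientation to the C# application over Wi-Fi; the credentials, server address, port, message format, and update rate are placeholders, not the authors' values.

```cpp
#include <Wire.h>
#include <WiFi.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);  // sensor ID, default I2C address
WiFiClient client;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  if (!bno.begin()) {                             // initialize the IMU
    Serial.println("BNO055 not detected");
    while (true) delay(10);
  }
  WiFi.begin("SSID", "PASSWORD");                 // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(500);
  client.connect("192.168.1.10", 5000);           // assumed C# app endpoint
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);                           // Euler angles of the upper limb
  if (client.connected()) {
    client.printf("%.2f;%.2f;%.2f\n",             // assumed message format
                  event.orientation.x,
                  event.orientation.y,
                  event.orientation.z);
  }
  delay(50);                                      // ~20 Hz update rate (assumed)
}
```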
3.3.3. Artificial Intelligence Based on Fuzzy Logic for Detecting and Avoiding Unforeseen Events
- For the first system, three virtual sensors (heart rate, body temperature, and blood oxygen level) were attached to the virtual patient and configured to provide distinct information about the patient’s biological signals, emitting visual alarm signals (Figure 13 (7)) if these parameters change in a way that endangers the patient’s life. Furthermore, a series of relationships between the parameters of these signals was created, starting from the premise that a change in the values of one signal can lead to a change in the values of another biological signal important for the patient’s safety; for example, a decrease in the blood oxygen level can lead to tachycardia. The system can be integrated as suggestive behavior in the control of the robotic system, considering that any change in the patient’s medical condition can reconfigure the command the robot receives. The system architecture consists of three inputs (heart rate, temperature, and blood oxygen level) and one output that displays future events, as can be seen in Figure 15. A sketch of this inference using the AForge.Fuzzy classes is given after the output ranges below.
- Between values 0 and 10, the output is “Danger Bradycardia”;
- Between values 15 and 35, the output is “Bradycardia Alert!”;
- Between values 40 and 60, the output is “Biological signals are within normal parameters”;
- Between values 65 and 80, the output is “Tachycardia Alert!”;
- Between values 85 and 100, the output is “Danger Tachycardia”.
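As referenced above, here is a minimal sketch of the heart-rate branch of this system using the AForge.Fuzzy classes named in Section 3.3.2 (FuzzySet, TrapezoidalFunction, InferenceSystem). The linguistic labels, membership bounds, and rules are assumptions derived from the output ranges listed above, not the authors' exact configuration.

```csharp
using System;
using AForge.Fuzzy;

public static class PatientMonitor
{
    public static InferenceSystem Build()
    {
        // Input: heart rate in beats per minute (bounds assumed).
        var heartRate = new LinguisticVariable("HeartRate", 0, 220);
        heartRate.AddLabel(new FuzzySet("Low", new TrapezoidalFunction(0, 5, 40, 60)));
        heartRate.AddLabel(new FuzzySet("Normal", new TrapezoidalFunction(50, 65, 85, 100)));
        heartRate.AddLabel(new FuzzySet("High", new TrapezoidalFunction(90, 110, 200, 219)));

        // Output: 0-100 scale matching the alert bands listed above (assumed shapes).
        var risk = new LinguisticVariable("Risk", 0, 100);
        risk.AddLabel(new FuzzySet("BradyAlert", new TrapezoidalFunction(10, 15, 35, 40)));
        risk.AddLabel(new FuzzySet("NormalSignals", new TrapezoidalFunction(35, 40, 60, 65)));
        risk.AddLabel(new FuzzySet("TachyAlert", new TrapezoidalFunction(60, 65, 80, 85)));

        var database = new Database();
        database.AddVariable(heartRate);
        database.AddVariable(risk);

        var system = new InferenceSystem(database, new CentroidDefuzzifier(1000));
        system.NewRule("Rule 1", "IF HeartRate IS Low THEN Risk IS BradyAlert");
        system.NewRule("Rule 2", "IF HeartRate IS Normal THEN Risk IS NormalSignals");
        system.NewRule("Rule 3", "IF HeartRate IS High THEN Risk IS TachyAlert");
        return system;
    }

    public static void Main()
    {
        InferenceSystem system = Build();
        system.SetInput("HeartRate", 48);            // simulated sensor reading
        float output = system.Evaluate("Risk");      // defuzzified crisp value
        Console.WriteLine($"Risk output: {output}"); // falls in the bradycardia alert band
    }
}
```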
- The second fuzzy system is used to detect collisions between the active instruments and the organs of the virtual human patient (Figure 16). While the robotic system is controlled and the three instruments are inserted into the body of the virtual human patient, the instruments may touch organs other than the intended ones; therefore, sets of rules are implemented that alert the user of the robotic system through messages (Figure 13 (7)) when an unwanted collision with another organ is happening or is about to happen. A Unity-side sketch of such collision detection is given after the output ranges below.
- Between values 0 and 20, the output is “No collision occurs”;
- Between values 30 and 45, the output is “Danger: Rib collision!”;
- Between values 55 and 60, the output is “Danger: Organ collision!”;
- Between values 80 and 100, the output is “Danger: Rib and Organ collision!”.
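As referenced above, this is a minimal Unity-side sketch of how such collisions could be detected and forwarded as a crisp input to the second fuzzy system; the tags, severity values, and reporting hook are assumptions chosen to match the output bands above, not the paper's code.

```csharp
using UnityEngine;

public class InstrumentTip : MonoBehaviour
{
    // Called by the physics engine when the tip's trigger collider
    // overlaps another collider (rib or organ meshes in the scene).
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Rib"))
            Report(40f);      // falls in the "Danger: Rib collision!" band
        else if (other.CompareTag("Organ"))
            Report(57f);      // falls in the "Danger: Organ collision!" band
    }

    private void Report(float severity)
    {
        // Hypothetical hook: sends the crisp value to the C# application,
        // where the fuzzy system maps it to one of the messages above.
        Debug.Log($"Collision severity sent to fuzzy system: {severity}");
    }
}
```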
3.3.4. Virtual Reality Application
- Lap: the command is activated by voice recognition, the background becomes yellow (Figure 13 (4)), and the user can control the insertion and removal of the laparoscope (Figure 2 (9)) by rotating the upper limb in the vertical plane (parallel to the yOz plane) around its Ox axis (Figure 18b,c); a sketch of this orientation-to-motion mapping is given after this list;
- CM 1: the background becomes yellow (Figure 13 (4)), and the user can control rotation module 1 (Figure 2 (5)) by a rotation movement of the upper limb in the vertical plane (parallel to the yOz plane) around the Ox axis (Figure 18e). After positioning rotation module 1, the user can insert and remove active instrument 1 (Figure 2 (6)) through a movement of rotation of the upper limb in the vertical plane (parallel to the yOz plane) around the Ox axis of the upper limb (Figure 18b,c);
- CM 2: the background becomes yellow (Figure 13 (4)), and the user can control rotation module 2 (Figure 2 (7)) through a movement of rotation of the upper limb in the vertical plane (parallel to the yOz plane) around the Ox axis (Figure 18f). After positioning rotation module 2, the user can insert and remove active instrument 2 (Figure 2 (8)) through a movement of rotation of the upper limb in the vertical plane (parallel to the yOz plane) around the Ox axis of the upper limb (Figure 18b,c);
- Stop C: the background becomes yellow (Figure 13 (4)), and the user stops controlling the robotic system;
- Cam1…Cam5: the background becomes yellow, and the user can change the viewing angle of the cameras;
- Organs Visualization ON: the background becomes yellow (Figure 13 (4)), and through this command the user can visualize the internal organs of the virtual human patient.
4. Results and Discussion
4.1. Experimental Validation
4.1.1. Participants
4.1.2. Performance Evolution
5. Summary and Conclusions
6. Patents
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Benway, B.M.; Bhayani, S.B.; Rogers, C.G.; Dulabon, L.M.; Patel, M.N.; Lipkin, M.; Wang, A.J.; Stifelman, M.D. Robot Assisted Partial Nephrectomy Versus Laparoscopic Partial Nephrectomy for Renal Tumors: A Multi-Institutional Analysis of Perioperative Outcomes. J. Urol. 2009, 182, 866–873.
- Tucan, P.; Vaida, C.; Horvath, D.; Caprariu, A.; Burz, A.; Gherman, B.; Iakab, S.; Pisla, D. Design and Experimental Setup of a Robotic Medical Instrument for Brachytherapy in Non-Resectable Liver Tumors. Cancers 2022, 14, 5841.
- Plitea, N.; Hesselbach, J.; Vaida, C.; Raatz, A.; Pisla, D.; Budde, C.; Vlad, L.; Burisch, A.; Senner, R. Innovative development of surgical parallel robots. Acta Electron. Mediamira Sci. Cluj Napoca 2007, 4, 201–206.
- Pugin, F.; Bucher, P.; Morel, P. History of robotic surgery: From AESOP® and ZEUS® to da Vinci®. J. Visc. Surg. 2011, 148, 3–8.
- Arkenbout, E.A.; Henselmans, P.W.J.; Jelínek, F.; Breedveld, P. A state of the art review and categorization of multi-branched instruments for NOTES and SILS. Surg. Endosc. 2015, 29, 1281–1296.
- Vasudevan, M.K.; Isaac, J.H.R.; Sadanand, V.; Muniyandi, M. Novel virtual reality based training system for fine motor skills: Towards developing a robotic surgery training system. Int. J. Med. Robot. Comput. Assist. Surg. 2020, 16, 1–14.
- Hagmann, K.; Hellings-Kuß, A.; Klodmann, J.; Richter, R.; Stulp, F.; Leidner, D. A Digital Twin Approach for Contextual Assistance for Surgeons During Surgical Robotics Training. Front. Robot. AI 2021, 8, 1–14.
- Vaida, C.; Pisla, D.; Plitea, N.; Gherman, B.; Gyurka, B.; Graur, F.; Vlad, L. Development of a Voice Controlled Surgical Robot. In New Trends in Mechanism Science. Mechanisms and Machine Science; Pisla, D., Ceccarelli, M., Husty, M., Corves, B., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; Volume 5, pp. 567–574.
- Gleason, A.; Servais, E.; Quadri, S.; Manganiello, M.; Cheah, Y.L.; Simon, C.; Preston, E.; Graham-Stephenson, A.; Wright, V. Developing basic robotic skills using virtual reality simulation and automated assessment tools: A multidisciplinary robotic virtual reality-based curriculum using the Da Vinci Skills Simulator and tracking progress with the Intuitive Learning platform. J. Robot. Surg. 2022, 16, 1313–1319.
- Iop, A.; El-Hajj, V.G.; Gharios, M.; de Giorgio, A.; Monetti, F.M.; Edström, E.; Elmi-Terander, A.; Romero, M. Extended Reality in Neurosurgical Education: A Systematic Review. Sensors 2022, 22, 6067.
- Korayem, M.; Vahidifar, V. Detecting hand’s tremor using leap motion controller in guiding surgical robot arms and laparoscopic scissors. Measurement 2022, 204, 1–11.
- Mehrfard, A.; Fotouhi, J.; Forster, T.; Taylor, G.; Fer, D.; Nagle, D.; Armand, M.; Navab, N.; Fuerst, B. On the effectiveness of virtual reality-based training for surgical robot setup. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2020, 9, 1–10.
- Mishra, R.; Narayanan, M.D.; Umana, G.E.; Montemurro, N.; Chaurasia, B.; Deora, H. Virtual Reality in Neurosurgery: Beyond Neurosurgical Planning. Int. J. Environ. Res. Public Health 2022, 19, 1719.
- Covaciu, F.; Pisla, A.; Vaida, C.; Gherman, B.; Pisla, D. Development of a Virtual Reality Simulator for a Lower Limb Rehabilitation Robot. In Proceedings of the IEEE International Conference on Automation, Quality and Testing, Robotics (AQTR), Cluj-Napoca, Romania, 21–23 May 2020.
- Covaciu, F.; Gherman, B.; Pisla, A.; Carbone, G.; Pisla, D. Rehabilitation System with Integrated Visual Stimulation. In Proceedings of the European Conference on Mechanism Science, Cluj-Napoca, Romania, 7–10 September 2020.
- Korayem, M.; Madihi, M.; Vahidifar, V. Controlling surgical robot arm using leap motion controller with Kalman filter. Measurement 2021, 178, 109372.
- Ehrampoosh, A.; Shirinzadeh, B.; Pinskier, J.; Smith, J.; Moshinsky, R.; Zhong, Y. A Force-Feedback Methodology for Teleoperated Suturing Task in Robotic-Assisted Minimally Invasive Surgery. Sensors 2022, 22, 7829.
- Chua, Z.; Okamura, A.M. A Modular 3-Degrees-of-Freedom Force Sensor for Robot-Assisted Minimally Invasive Surgery Research. Sensors 2023, 23, 5230.
- Abad, A.C.; Reid, D.; Ranasinghe, A. A Novel Untethered Hand Wearable with Fine-Grained Cutaneous Haptic Feedback. Sensors 2022, 22, 1924.
- Pisla, D.; Gherman, B.; Plitea, N.; Gyurka, B.; Vaida, C.; Vlad, L.; Graur, F.; Radu, C.; Suciu, M.; Szilaghi, A.; et al. PARASURG hybrid parallel robot for minimally invasive surgery. Chirurgia 2011, 106, 619–625.
- Martin, J.R.; Stefanidis, D.; Dorin, R.P.; Goh, A.C.; Satava, R.M.; Levy, J.S. Demonstrating the effectiveness of the fundamentals of robotic surgery (FRS) curriculum on the RobotiX Mentor Virtual Reality Simulation Platform. J. Robot. Surg. 2020, 15, 187–193.
- Covaciu, F.; Pisla, A.; Iordan, A.-E. Development of a Virtual Reality Simulator for an Intelligent Robotic System Used in Ankle Rehabilitation. Sensors 2021, 21, 1537.
- Covaciu, F.; Iordan, A.-E. Control of a Drone in Virtual Reality Using MEMS Sensor Technology and Machine Learning. Micromachines 2022, 13, 521.
- Luca, A.; Giorgino, R.; Gesualdo, L.; Peretti, G.M.; Belkhou, A.; Banfi, G.; Grasso, G. Innovative Educational Pathways in Spine Surgery: Advanced Virtual Reality–Based Training. World Neurosurg. 2020, 140, 674–680.
- Portelli, M.; Bianco, S.; Bezzina, T.; Abela, J. Virtual reality training compared with apprenticeship training in laparoscopic surgery: A meta-analysis. R. Coll. Surg. Engl. 2020, 102, 672–684.
- Trochimczuk, R.; Łukaszewicz, A.; Mikołajczyk, T.; Aggogeri, F.; Borboni, A. Finite element method stiffness analysis of a novel telemanipulator for minimally invasive surgery. Simulation 2019, 95, 1015–1025.
- Kawashima, K.; Kanno, T.; Tadano, K. Robots in laparoscopic surgery: Current and future status. BMC Biomed. Eng. 2019, 1, 1–6.
- Longmore, S.K.; Naik, G.; Gargiulo, G.D. Laparoscopic Robotic Surgery: Current Perspective and Future Directions. Robotics 2020, 9, 42.
- Korayem, M.; Vosoughi, R.; Vahidifar, V. Design, manufacture, and control of a laparoscopic robot via Leap Motion sensors. Measurement 2022, 205, 1–13.
- Batty, T.; Ehrampoosh, A.; Shirinzadeh, B.; Zhong, Y.; Smith, J. A Transparent Teleoperated Robotic Surgical System with Predictive Haptic Feedback and Force Modelling. Sensors 2022, 22, 9770.
- Mao, R.Q.; Lan, L.; Kay, J.; Lohre, R.; Ayeni, O.R.; Goel, D.P.; de Sa, D. Immersive Virtual Reality for Surgical Training: A Systematic Review. J. Surg. Res. 2021, 268, 40–58.
- Kalinov, T.; Georgiev, T.; Bliznakova, K.; Zlatarov, A.; Kolev, N. Assessment of students’ satisfaction with virtual robotic surgery training. Heliyon 2023, 9, 1–8.
- Lamblin, G.; Thiberville, G.; Druette, L.; Moret, S.; Couraud, S.; Martin, X.; Dubernard, G.; Chene, G. Virtual reality simulation to enhance laparoscopic salpingectomy skills. J. Gynecol. Obstet. Hum. Reprod. 2020, 49, 101685.
- Elessawy, M.; Mabrouk, M.; Heilmann, T.; Weigel, M.; Zidan, M.; Abu-Sheasha, G.; Farrokh, A.; Bauerschlag, D.; Maass, N.; Ibrahim, M.; et al. Evaluation of Laparoscopy Virtual Reality Training on the Improvement of Trainees’ Surgical Skills. Medicina 2021, 57, 130.
- Gherman, B.; Vaida, C.; Pisla, D.; Plitea, N.; Gyurka, B.; Lese, D.; Glogoveanu, M. Singularities and Workspace Analysis for a Parallel Robot for Minimally Invasive Surgery. In Proceedings of the IEEE International Conference on Automation, Quality and Testing, Robotics (AQTR), Cluj-Napoca, Romania, 28–30 May 2010.
- Wenger, P.; Chablat, D. A Review of Cuspidal Serial and Parallel Manipulators. ASME J. Mech. Robot. 2023, 15, 040801.
- Pisla, D.; Plitea, N.; Videan, A.; Prodan, B.; Gherman, B.; Lese, D. Kinematics and design of two variants of a reconfigurable parallel robot. In Proceedings of the ASME/IFToMM International Conference on Reconfigurable Mechanisms and Robots, London, UK, 24 July 2009.
- Franklin, C.S.; Dominguez, E.G.; Fryman, J.D.; Lewandowski, M.L. Collaborative robotics: New era of human–robot cooperation in the workplace. J. Saf. Res. 2020, 74, 153–160.
- Tucan, P.; Vaida, C.; Plitea, N.; Pisla, A.; Carbone, G.; Pisla, D. Risk-Based Assessment Engineering of a Parallel Robot Used in Post-Stroke Upper Limb Rehabilitation. Sustainability 2019, 11, 2893.
- Merlet, J.-P. Parallel Robots, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2006.
- Pisla, D.; Gherman, B.; Tucan, P.; Birlescu, I.; Pusca, A.; Rus, G.; Pisla, A.; Vaida, C. Application oriented modelling and simulation of an innovative parallel robot for single incision laparoscopic surgery. In Proceedings of the ASME, St. Louis, MO, USA, 14–17 August 2022; pp. 1–10.
- Pisla, D.; Pusca, A.; Tucan, P.; Gherman, B.; Vaida, C. Kinematics and workspace analysis of an innovative 6-dof parallel robot for SILS. Proc. Rom. Acad. 2022, 23, 279–288.
- Available online: https://ro.mouser.com/new/bosch/bosch-bno55-sensor/ (accessed on 5 January 2022).
- Available online: https://www.espressif.com/sites/default/files/documentation/esp32_datasheet_en.pdf (accessed on 5 January 2022).
- Iordan, A.E.; Covaciu, F. Improving Design of a Triangle Geometry Computer Application using a Creational Pattern. Acta Tech. Napoc. Ser.-Appl. Math. Mech. Eng. 2020, 63, 73–78.
- Iordan, A.E. Optimal solution of the Guarini puzzle extension using tripartite graphs. IOP Conf. Ser. Mater. Sci. Eng. 2019, 477, 1–8.
- Levitin, A. Algorithmic Puzzles: History, Taxonomies, and Applications in Human Problem Solving. J. Probl. Solving 2017, 10, 1.
- Iordan, A. Development of Interactive Software for Teaching Three-Dimensional Analytic Geometry. In Proceedings of the 9th International Conference on Distance Learning and Web Engineering, Budapest, Hungary, 3–5 September 2009.
- Panoiu, M.; Panoiu, C.; Iordan, A.; Ghiormez, L. Artificial neural networks in predicting current in electric arc furnaces. IOP Conf. Ser. Mater. Sci. Eng. 2014, 57, 1–7.
- Dumitrescu, C.; Ciotirnae, P.; Vizitiu, C. Fuzzy Logic for Intelligent Control System Using Soft Computing Applications. Sensors 2021, 21, 2617.
- Dong, B.; Luo, Z.; Lu, J.; Yang, Y.; Song, Y.; Cao, J.; Li, W. Single-incision laparoscopic versus conventional laparoscopic right colectomy: A systematic review and meta-analysis. Int. J. Surg. 2018, 55, 31–38.
- Pisla, D.; Carami, D.; Gherman, B.; Soleti, G.; Ulinici, I.; Vaida, C. A novel control architecture for robotic-assisted single incision laparoscopic surgery. Rom. J. Tech. Sci. Appl. Mech. 2021, 66, 141–162.
- Aydın, A.; Ahmed, K.; Abe, T.; Raison, N.; Van Hemelrijck, M.; Garmo, H.; Ahmed, H.U.; Mukhtar, F.; Al-Jabir, A.; Brunckhorst, O.; et al. Effect of Simulation-based Training on Surgical Proficiency and Patient Outcomes: A Randomised Controlled Clinical and Educational Trial. Eur. Urol. 2021, 81, 385–393.
Subject | Age | Gender
--- | --- | ---
1 | 42 | m
2 | 25 | f
3 | 22 | f
4 | 43 | m
5 | 35 | m
6 | 36 | m
7 | 43 | f
8 | 24 | f
9 | 22 | m
10 | 43 | m
11 | 26 | m
12 | 23 | f
13 | 30 | m
14 | 24 | f
15 | 27 | m