Surgical Instrument Signaling Gesture Recognition Using Surface Electromyography Signals
Abstract
1. Introduction
2. Surgical Instrument Signaling
- Compress (Figure 1-1): Hand is flattened with fingers together and the palmar surface facing upwards;
- Medical Thread on a Spool (Figure 1-2): Palm of the hand faces upwards with fingers semi-flexed;
- Medical Thread Loose (Figure 1-3): Hand is palm down with fingers semi-flexed;
- Plier Backhaus (Figure 1-4): Ring, middle, and index fingers are flexed while the thumb is interposed between the index and middle fingers;
- Hemostatic Forceps (Figure 1-5): Index and middle fingers are crossed with the palm side of the hand facing down;
- Kelly Hemostatic Forceps (Figure 1-6): Ring and little fingers are flexed, and the other fingers are extended;
- Farabeuf Retractor (Figure 1-7): Index finger is semi-flexed, and the other fingers are flexed; the hand moves similarly to handling the instrument;
- Bistouri (Figure 1-8): All fingers are semi-flexed and gathered at the tips, performing a pendular movement, similar to the movement performed when handling the instrument;
- Needle Holder (Figure 1-9): Index, middle, ring, and pinkie fingers are semi-flexed, and the thumb is partially flexed on the opposite side; the hand performs small rotational movements;
- Valve Doyen (Figure 1-10): Hand moves with all fingers together; the fingers are stretched out and at right angles to the rest of the hand;
- Allis Clamp (Figure 1-11): Thumb and index fingers are semi-flexed, making an opening and closing movement with the thumb holding the index finger, while the other fingers remain flexed;
- Anatomical Tweezers (Figure 1-12): Thumb and index fingers are extended, making an approach and removal movement, while the other fingers remain flexed;
- Rat’s Tooth Forceps (Figure 1-13): Thumb and index fingers are semi-flexed (making an opening and closing movement with the ends of the index finger and thumb touching each other), and the other fingers remain flexed;
- Scissors (Figure 1-14): Index and middle fingers are kept extended.
3. Methodology
- AR: 4th-, 6th-, 9th-, and 15th-order coefficients;
- CC: 4th- and 9th-order coefficients;
- HIST: Nine bins;
- LS: Second moment;
- SampEn: Dimension = 2 and R = 0.2.
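To illustrate how such features are computed from a single sEMG window, the sketch below implements a few of the time-domain features used in the feature sets later in the paper, plus Sample Entropy with the dimension and tolerance listed above (m = 2, r = 0.2). The zero-crossing/slope-change noise thresholds and the exact SampEn normalization are assumptions; implementations in the literature vary.

```python
import numpy as np

def mav(x):        # Mean Absolute Value
    return np.mean(np.abs(x))

def rms(x):        # Root Mean Square
    return np.sqrt(np.mean(x**2))

def wl(x):         # Waveform Length: total vertical excursion of the signal
    return np.sum(np.abs(np.diff(x)))

def zc(x, thresh=0.0):   # Zero Crossings above a noise threshold (assumed 0 here)
    return int(np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > thresh)))

def ssc(x, thresh=0.0):  # Slope Sign Changes: local extrema exceeding the threshold
    d1, d2 = x[1:-1] - x[:-2], x[1:-1] - x[2:]
    return int(np.sum((d1 * d2 > 0) & ((np.abs(d1) > thresh) | (np.abs(d2) > thresh))))

def sampen(x, m=2, r=0.2):
    """Sample Entropy with embedding dimension m and tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def count(mm):
        # Count template pairs within tolerance (Chebyshev distance), excluding self-matches
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            c += np.sum(d <= tol) - 1
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Each feature maps one window (per channel) to a scalar; a feature vector concatenates these values over channels.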
- KNN: Euclidean distance and 1 nearest neighbour;
- RF: Thirty trees;
- SVM: Radial basis function, C = 10, Gaussian size = 0.1, one-versus-one method;
- MLP: Learning rate = 0.0025; one hidden layer with 30 neurons; hyperbolic tangent activation in the hidden layer and logistic activation in the output layer; trained by backpropagation; stopping criteria based on the number of epochs and on a mean squared error precision of 10⁻⁷.
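To make the KNN configuration concrete, a minimal 1-nearest-neighbour classifier with Euclidean distance can be written as below. This is an illustrative sketch, not the authors' implementation (the SVM, for instance, is typically run through a library such as LIBSVM [cited above]).

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=1):
    """k-nearest-neighbour classification with Euclidean distance.
    With k = 1 (as configured above) each test sample simply takes the
    label of its single closest training sample."""
    X_train = np.asarray(X_train, dtype=float)
    preds = []
    for x in np.asarray(X_test, dtype=float):
        d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to all training samples
        nearest = np.argsort(d)[:k]               # indices of the k closest samples
        labels, counts = np.unique(np.take(y_train, nearest), return_counts=True)
        preds.append(labels[np.argmax(counts)])   # majority vote (trivial for k = 1)
    return np.array(preds)
```

Usage: `knn_predict(train_features, train_labels, test_features)` returns one predicted gesture label per test window.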
4. Results
4.1. Combination of Feature Sets and Classifiers
4.2. Ensemble and Gesture Analysis
4.3. Volunteers’ Analysis
5. Discussion
Comparison with Related Works
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ANN | Artificial Neural Network |
AR | Autoregressive Coefficients |
ART | Assertiveness Rate |
C | Coefficient of optimization of SVM |
CC | Cepstral Coefficients |
CNN | Convolutional Neural Network |
csv | Comma-separated values file |
DASDV | Difference Absolute Standard Deviation Value |
DT | Decision Tree |
Ens_a | Ensemble with automatic search |
Ens_m | Ensemble with manual search |
HIST | Histogram |
IAV | Integral of Absolute Value |
KNN | k-Nearest Neighbour |
LD | Log Detector |
LDA | Linear Discriminant Analysis |
LS | L-Scale |
MAV | Mean Absolute Value |
MFL | Maximum Fractal Length |
MLP | Multilayer Perceptron |
MNP | Mean Power |
MSR | Mean Square Rate |
QDA | Quadratic Discriminant Analysis |
RF | Random Forest |
RMS | Root Mean Square |
SampEn | Sample Entropy |
sEMG | Surface Electromyography |
SIS | Surgical Instrument Signaling |
SSC | Slope Sign Change |
SVM | Support Vector Machine |
TTP | Total Power |
USB | Universal Serial Bus |
VAR | Variance |
WAMP | Willison Amplitude |
WL | Waveform Length |
ZC | Zero Crossing |
References
- Karjalainen, M.; Kontunen, A.; Anttalainen, A.; Mäkelä, M.; Varga, S.; Lepomäki, M.; Anttalainen, O.; Kumpulainen, P.; Oksala, N.; Roine, A.; et al. Characterization of signal kinetics in real time surgical tissue classification system. Sens. Actuators B Chem. 2022, 365, 131902.
- Jacob, M.; Li, Y.T.; Akingba, G.; Wachs, J.P. Gestonurse: A robotic surgical nurse for handling surgical instruments in the operating room. J. Robot. Surg. 2012, 6, 53–63.
- Nemitz, R. Surgical Instrumentation, 3rd ed.; Elsevier Inc.: St. Louis, MO, USA, 2018.
- Phillips, N. Surgical Instrumentation, 2nd ed.; Cengage Learning: Clifton Park, NY, USA; Andover, MA, USA, 2018.
- Smith, C.J.; Rane, R.; Melendez, L. Operating Room. In Clinical Engineering Handbook; Dyro, J.F., Ed.; Biomedical Engineering, Academic Press: Cambridge, MA, USA, 2004; pp. 376–384.
- Yang, T.; Da Silva, H.B.; Sekhar, L.N. Surgical Positioning, Navigation, Important Surgical Tools, Craniotomy, and Closure of Cranial and Spinal Wounds. In Principles of Neurological Surgery, 4th ed.; Ellenbogen, R.G., Sekhar, L.N., Kitchen, N.D., da Silva, H.B., Eds.; Elsevier: Amsterdam, The Netherlands, 2018; pp. 103–115.e1.
- Peters, B.S.; Armijo, P.R.; Krause, C.; Choudhury, S.A.; Oleynikov, D. Review of emerging surgical robotic technology. Surg. Endosc. 2018, 32, 1636–1655.
- Angelini, L.; Papaspyropoulos, V. Telesurgery. Ultrasound Med. Biol. 2000, 26, S45–S47.
- Bradley, S. Human computer interfaces for telesurgery. In Proceedings of the IEE Colloquium on Towards Telesurgery, London, UK, 20 June 1995; pp. 5/1–5/5.
- Choi, P.J.; Oskouian, R.J.; Tubbs, R.S. Telesurgery: Past, Present, and Future. Cureus 2018, 10, e2716.
- Le, H.T.; Pham, H.T.T. Hand Signal Recognition for Handling Surgical Instruments. In Proceedings of the 6th International Conference on the Development of Biomedical Engineering in Vietnam (BME6); Vo Van, T., Nguyen Le, T.A., Nguyen Duc, T., Eds.; IFMBE Proceedings; Springer: Singapore, 2018; pp. 587–592.
- Murillo, P.C.U.; Arenas, J.O.P.; Moreno, R.J. Tree-Structured CNN for the Classification of Surgical Instruments. In Proceedings of the Intelligent Computing Systems, Merida, Mexico, 21–23 March 2018; Brito-Loeza, C., Espinosa-Romero, A., Eds.; Communications in Computer and Information Science; Springer: Cham, Switzerland, 2018; pp. 15–30.
- Luongo, F.; Hakim, R.; Nguyen, J.H.; Anandkumar, A.; Hung, A.J. Deep learning-based computer vision to recognize and classify suturing gestures in robot-assisted surgery. Surgery 2021, 169, 1240–1244.
- Ebrahim Al-Ahdal, M.; Nooritawati, M.T. Review in Sign Language Recognition Systems. In Proceedings of the 2012 IEEE Symposium on Computers & Informatics (ISCI), Penang, Malaysia, 18–20 March 2012; pp. 52–57.
- López-Casado, C.; Bauzano, E.; Rivas-Blanco, I.; Pérez-del Pulgar, C.J.; Muñoz, V.F. A Gesture Recognition Algorithm for Hand-Assisted Laparoscopic Surgery. Sensors 2019, 19, 5182.
- Bieck, R.; Fuchs, R.; Neumuth, T. Surface EMG-based Surgical Instrument Classification for Dynamic Activity Recognition in Surgical Workflows. Curr. Dir. Biomed. Eng. 2019, 5, 37–40.
- Jacob, M.G.; Li, Y.T.; Wachs, J.P. A gesture driven robotic scrub nurse. In Proceedings of the 2011 IEEE International Conference on Systems, Man, and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 2039–2044.
- Jacob, M.G.; Li, Y.T.; Wachs, J.P. Surgical instrument handling and retrieval in the operating room with a multimodal robotic assistant. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 2140–2145.
- Jacob, M.G.; Li, Y.T.; Wachs, J.P. Gestonurse: A multimodal robotic scrub nurse. In Proceedings of the 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Boston, MA, USA, 5–8 March 2012; pp. 153–154.
- Perez-Vidal, C.; Carpintero, E.; Garcia-Aracil, N.; Sabater-Navarro, J.M.; Azorin, J.M.; Candela, A.; Fernandez, E. Steps in the development of a robotic scrub nurse. Robot. Auton. Syst. 2012, 60, 901–911.
- Zhou, T.; Wachs, J.P. Needle in a haystack: Interactive surgical instrument recognition through perception and manipulation. Robot. Auton. Syst. 2017, 97, 182–192.
- Qi, W.; Liu, X.; Zhang, L.; Wu, L.; Zang, W.; Su, H. Adaptive sensor fusion labeling framework for hand pose recognition in robot teleoperation. Assem. Autom. 2021, 41, 393–400.
- Kamen, G.; Gabriel, D. Essentials of Electromyography; Human Kinetics: Champaign, IL, USA, 2010.
- De Luca, C. Electromyography. In Encyclopedia of Medical Devices and Instrumentation, 2nd ed.; John Wiley Publisher: New York, NY, USA, 2006; Volume 2, p. 3666.
- Mendes Junior, J.J.A.; Freitas, M.L.B.; Campos, D.P.; Farinelli, F.A.; Stevan, S.L.; Pichorim, S.F. Analysis of Influence of Segmentation, Features, and Classification in sEMG Processing: A Case Study of Recognition of Brazilian Sign Language Alphabet. Sensors 2020, 20, 4359.
- Cheok, M.J.; Omar, Z.; Jaward, M.H. A review of hand gesture and sign language recognition techniques. Int. J. Mach. Learn. Cybern. 2019, 10, 131–153.
- Zheng, M.; Crouch, M.S.; Eggleston, M.S. Surface Electromyography as a Natural Human–Machine Interface: A Review. IEEE Sens. J. 2022, 22, 9198–9214.
- Chowdhury, R.H.; Reaz, M.B.I.; Ali, M.A.B.M.; Bakar, A.A.A.; Chellappan, K.; Chang, T.G. Surface Electromyography Signal Processing and Classification Techniques. Sensors 2013, 13, 12431–12466.
- Nsugbe, E.; Phillips, C.; Fraser, M.; McIntosh, J. Gesture recognition for transhumeral prosthesis control using EMG and NIR. IET Cyber-Syst. Robot. 2020, 2, 122–131. Available online: https://onlinelibrary.wiley.com/doi/pdf/10.1049/iet-csr.2020.0008 (accessed on 1 June 2021).
- Pezzella, A.T. Hand signals in surgery. AORN J. 1996, 63, 769–772.
- Fulchiero, G.J.J.; Vujevich, J.J.; Goldberg, L.H. Nonverbal Hand Signals: A Tool for Increasing Patient Comfort During Dermatologic Surgery. Dermatol. Surg. 2009, 35, 856.
- Moriya, T.; Vicente, Y.A.M.V.D.A.; Tazima, M.D.F.G.S. Instrumental cirúrgico. Medicina 2011, 44, 18–32.
- Visconti, P.; Gaetani, F.; Zappatore, G.A.; Primiceri, P. Technical Features and Functionalities of Myo Armband: An Overview on Related Literature and Advanced Applications of Myoelectric Armbands Mainly Focused on Arm Prostheses. Int. J. Smart Sens. Intell. Syst. 2018, 11, 1–25.
- Corti, N. Myo EMG Visualizer. 2020.
- Freitas, M.L.B.; Mendes Junior, J.J.A.; La Banca, W.F.; Stevan, S.L., Jr. Algoritmo de Segmentação com Base em Análise Automática de Limiar para Sinais de Eletromiografia de Superfície. In Proceedings of the Anais do IX Congresso Latino-Americano de Engenharia Biomédica (CLAIB 2022) e XXVIII Congresso Brasileiro de Engenharia Biomédica (CBEB 2022), Florianópolis, SC, Brazil, 22–24 October 2022; p. 6.
- Mendes Junior, J.J.A.; Pontim, C.E.; Dias, T.S.; Campos, D.P. How do sEMG segmentation parameters influence pattern recognition process? An approach based on wearable sEMG sensor. Biomed. Signal Process. Control 2023, 81, 104546.
- Meyer, P.L. Introductory Probability and Statistical Applications; Addison-Wesley: Hoboken, NJ, USA, 1965.
- Hudgins, B.; Parker, P.; Scott, R.N. A new strategy for multifunction myoelectric control. IEEE Trans. Biomed. Eng. 1993, 40, 82–94.
- Huang, Y.; Englehart, K.; Hudgins, B.; Chan, A. A Gaussian mixture model based classification scheme for myoelectric control of powered upper limb prostheses. IEEE Trans. Biomed. Eng. 2005, 52, 1801–1811.
- Liu, Y.H.; Huang, H.P.; Weng, C.H. Recognition of Electromyographic Signals Using Cascaded Kernel Learning Machine. IEEE/ASME Trans. Mechatron. 2007, 12, 253–264.
- Phinyomark, A.; Khushaba, R.N.; Scheme, E. Feature Extraction and Selection for Myoelectric Control Based on Wearable EMG Sensors. Sensors 2018, 18, 1615.
- Tkach, D.; Huang, H.; Kuiken, T.A. Study of stability of time-domain features for electromyographic pattern recognition. J. Neuroeng. Rehabil. 2010, 7, 21.
- Phinyomark, A.; Quaine, F.; Charbonnier, S.; Serviere, C.; Tarpin-Bernard, F.; Laurillau, Y. EMG feature evaluation for improving myoelectric pattern recognition robustness. Expert Syst. Appl. 2013, 40, 4832–4840.
- Bhattacharya, A.; Sarkar, A.; Basak, P. Time domain multi-feature extraction and classification of human hand movements using surface EMG. In Proceedings of the 2017 4th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 6–7 January 2017; pp. 1–5.
- Junior, J.J.A.M.; Freitas, M.; Siqueira, H.; Lazzaretti, A.E.; Stevan, S.; Pichorim, S.F. Comparative analysis among feature selection of sEMG signal for hand gesture classification by armband. IEEE Lat. Am. Trans. 2020, 18, 1135–1143.
- Siqueira, H.; Macedo, M.; Tadano, Y.d.S.; Alves, T.A.; Stevan, S.L., Jr.; Oliveira, D.S., Jr.; Marinho, M.H.; Neto, P.S.d.M.; Oliveira, J.F.d.; Luna, I.; et al. Selection of temporal lags for predicting riverflow series from hydroelectric plants using variable selection methods. Energies 2020, 13, 4236.
- Belotti, J.; Siqueira, H.; Araujo, L.; Stevan, S.L., Jr.; de Mattos Neto, P.S.; Marinho, M.H.; de Oliveira, J.F.L.; Usberti, F.; Leone Filho, M.d.A.; Converti, A.; et al. Neural-based ensembles and unorganized machines to predict streamflow series from hydroelectric plants. Energies 2020, 13, 4769.
- Siqueira, H.; Santana, C.; Macedo, M.; Figueiredo, E.; Gokhale, A.; Bastos-Filho, C. Simplified binary cat swarm optimization. Integr. Comput.-Aided Eng. 2021, 28, 35–50.
- de Mattos Neto, P.S.; de Oliveira, J.F.; Júnior, D.S.d.O.S.; Siqueira, H.V.; Marinho, M.H.; Madeiro, F. An adaptive hybrid system using deep learning for wind speed forecasting. Inf. Sci. 2021, 581, 495–514.
- Chang, C.C.; Lin, C.J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2011, 2, 1–27. Available online: http://www.csie.ntu.edu.tw/~cjlin/libsvm (accessed on 2 June 2021).
- Ashraf, H.; Waris, A.; Jamil, M.; Gilani, S.O.; Niazi, I.K.; Kamavuako, E.N.; Gilani, S.H.N. Determination of Optimum Segmentation Schemes for Pattern Recognition-Based Myoelectric Control: A Multi-Dataset Investigation. IEEE Access 2020, 8, 90862–90877.
- Sandoval-Espino, J.A.; Zamudio-Lara, A.; Marbán-Salgado, J.A.; Escobedo-Alatorre, J.J.; Palillero-Sandoval, O.; Velásquez-Aguilar, J.G. Selection of the Best Set of Features for sEMG-Based Hand Gesture Recognition Applying a CNN Architecture. Sensors 2022, 22, 4972.
- Essa, R.R.; Jaber, H.A.; Jasim, A.A. Features selection for estimating hand gestures based on electromyography signals. Bull. Electr. Eng. Inform. 2023, 12, 2087–2094.
- Abbaspour, S.; Lindén, M.; Gholamhosseini, H.; Naber, A.; Ortiz-Catalan, M. Evaluation of surface EMG-based recognition algorithms for decoding hand movements. Med. Biol. Eng. Comput. 2020, 58, 83–100.
- Zhang, Y.; Yeung, D.Y. Heteroscedastic Probabilistic Linear Discriminant Analysis with Semi-supervised Extension. In Machine Learning and Knowledge Discovery in Databases; Buntine, W., Grobelnik, M., Mladenić, D., Shawe-Taylor, J., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2009; pp. 602–616.
- Akinsola, J.E.T. Supervised Machine Learning Algorithms: Classification and Comparison. Int. J. Comput. Trends Technol. (IJCTT) 2017, 48, 128–138.
- Tomar, D. A survey on Data Mining approaches for Healthcare. Int. J. Bio-Sci. Bio-Technol. 2013, 5, 241–266.
- Pontim, C.E.; Mendes Júnior, J.J.A.; Martins, H.V.P.; Campos, D.P. Impact of sEMG Time-series Segmentation Parameters on the Recognition of Hand Gestures. J. Appl. Instrum. Control 2020, 8, 1–7.
- Phinyomark, A.; Scheme, E. EMG pattern recognition in the era of big data and deep learning. Big Data Cogn. Comput. 2018, 2, 21.
- Su, Z.; Liu, H.; Qian, J.; Zhang, Z.; Zhang, L. Hand gesture recognition based on sEMG signal and convolutional neural network. Int. J. Pattern Recognit. Artif. Intell. 2021, 35, 2151012.
Parameter | Value |
---|---|
Number of gestures | 14 |
Number of volunteers | 10 |
Acquisitions per volunteer | 30 |
Estimated time for each sEMG activation | 2 s |
Assertiveness rate | 80% |
Amplitude limit for threshold detection | 500 aq·u² |
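The amplitude-threshold segmentation implied by the parameters above can be sketched as follows. The 200 Hz sampling rate (the Myo armband's sEMG rate), the use of the rectified signal for the threshold test, and the exact onset logic are assumptions; the units of the 500 limit are raw acquisition units.

```python
import numpy as np

FS = 200         # assumed Myo armband sEMG sampling rate, Hz
WINDOW_S = 2     # estimated activation length from the table, seconds
THRESHOLD = 500  # amplitude limit from the table, acquisition units

def extract_activation(signal, fs=FS, threshold=THRESHOLD, window_s=WINDOW_S):
    """Return the first window_s-second segment starting where the rectified
    signal crosses the amplitude threshold, or None if no full segment exists.
    A simplified sketch of threshold-based onset detection."""
    above = np.flatnonzero(np.abs(signal) >= threshold)
    if above.size == 0:
        return None                      # no activation detected
    start = above[0]                     # first sample over threshold
    n = int(window_s * fs)               # samples in one activation window
    segment = np.asarray(signal)[start:start + n]
    return segment if segment.size == n else None
```

In practice each of the armband's channels would be segmented with the same onset index so the feature windows stay aligned.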
Feature Set | Features | Reference |
---|---|---|
G1 | MAV, WL, ZC, and SSC | [38] |
G2 | RMS and AR6 | [39] |
G3 | MAV, WL, ZC, SSC, RMS, and AR6 | [39] |
G4 | AR4 and HIST | [40] |
G5 | WL, LD, SSC, and AR9 | [41,42] |
G6 | WL, SSC, AR9, and CC9 | [41,42] |
G7 | RMS, VAR, LD, and HIST | [41,42] |
G8 | WL, RMS, SampEn, and CC4 | [43] |
G9 | AR15, ZC, MAV, RMS, SSC, and WL | [44] |
G10 | MAV and AR4 | [25] |
G11 | MFL, MSR, WAMP, and LS | [41] |
G12 | LS, MFL, MSR, WAMP, ZC, RMS, IAV, DASDV, and VAR | [41] |
G13 | MFL, MNP, TTP, and RMS | [25] |
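Programmatically, the feature sets in the table reduce to a lookup from set name to feature list, with one extractor callable per feature name. The mapping below reproduces only a subset of the table, and the extractor dictionary passed in is a placeholder the caller supplies.

```python
# Subset of the feature-set table; AR6/AR9 denote autoregressive model order,
# CC4/CC9 cepstral-coefficient order.
FEATURE_SETS = {
    "G1": ["MAV", "WL", "ZC", "SSC"],
    "G2": ["RMS", "AR6"],
    "G3": ["MAV", "WL", "ZC", "SSC", "RMS", "AR6"],
    "G10": ["MAV", "AR4"],
    "G11": ["MFL", "MSR", "WAMP", "LS"],
}

def feature_vector(window, extractors, feature_set="G1"):
    """Concatenate the extractor outputs named by a feature set.
    `extractors` maps a feature name to a callable taking one sEMG window."""
    return [extractors[name](window) for name in FEATURE_SETS[feature_set]]
```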
Best Recognized Gestures | SVM | Ens_a | Ens_m |
---|---|---|---|
1 | 0.97 | 1 | 1 |
3 | 0.83 | 0.87 | 0.87 |
5 | 0.83 | 0.87 | 0.87 |
6 | 0.83 | 0.8 | 0.8 |
10 | 0.83 | 0.83 | 0.83 |

Worst Recognized Gestures | SVM | Ens_a | Ens_m |
---|---|---|---|
4 | 0.63 | 0.63 | 0.63 |
7 | 0.63 | 0.67 | 0.67 |
9 | 0.6 | 0.57 | 0.57 |
12 | 0.65 | 0.47 | 0.47 |
13 | 0.53 | 0.47 | 0.47 |
Best Recognized Gestures | MLP | Ens_a |
---|---|---|
1 | 0.94 | 0.96 |
2 | 0.93 | 0.95 |
3 | 0.93 | 0.93 |
5 | 0.92 | 0.93 |
6 | 0.91 | 0.95 |

Worst Recognized Gestures | MLP | Ens_a |
---|---|---|
8 | 0.87 | 0.83 |
9 | 0.87 | 0.81 |
11 | 0.81 | 0.79 |
12 | 0.8 | 0.77 |
13 | 0.79 | 0.79 |
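The per-gesture values in the tables above correspond to per-class recall (fraction of that gesture's samples recognized correctly). A minimal sketch of computing such rates, and of combining classifiers by majority vote as a stand-in for the ensemble idea (the paper's Ens_a/Ens_m search strategies are not reproduced here):

```python
import numpy as np

def per_gesture_rate(y_true, y_pred, n_classes):
    """Fraction of correctly recognized samples for each gesture class."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    rates = np.zeros(n_classes)
    for c in range(n_classes):
        mask = y_true == c
        rates[c] = np.mean(y_pred[mask] == c) if mask.any() else np.nan
    return rates

def majority_vote(*predictions):
    """Combine several classifiers' label predictions sample-by-sample
    by majority vote; ties resolve to the smallest label here."""
    stacked = np.stack([np.asarray(p) for p in predictions])  # (n_classifiers, n_samples)
    out = []
    for col in stacked.T:
        labels, counts = np.unique(col, return_counts=True)
        out.append(labels[np.argmax(counts)])
    return np.array(out)
```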
Reference | Objective | Techniques | Gestures/Instruments | Results |
---|---|---|---|---|
[11] | Recognition of instruments and their placement | Kinect sensor; image classification | 5 gestures; numbers representing instruments | System is viable for classification |
[12] | Recognition of surgical instruments | Image classification; CNN classifier structured by DT | 10 instruments; 5 instruments | 10 instruments: 70%; 5 instruments: 96% |
[13] | Recognition of suturing gestures | Image classification; deep learning | — | Gesture presence detection: 88%; gesture identification: 87% |
[21] | Recognition of instruments placed on a holder | Optical sensor; image classification; robotic manipulation | — | 95.6% accuracy |
[22] | Gesture recognition | EMG (Myo Armband™); several classifiers, such as SVM, KNN, and convolutional neural networks | 10 gestures representing numbers from 1 to 10 | 88.5% for deep CNN |
[16] | Recognition of Surgical Instrument Signaling gestures | EMG (Myo Armband™); ANN and DT classifiers | 5 instruments and gestures | 1 volunteer: 95.2% (ANN) and 95.3% (DT); 5 volunteers: 70.1% (ANN) and 71.2% (DT) |
This work | Recognition of Surgical Instrument Signaling gestures | EMG (Myo Armband™); several classifiers and ensembles; feature selection; comparison among volunteers | 14 SIS gestures | 10 volunteers: 76% (SVM); individual classification: 88% (ensemble method) |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Freitas, M.L.B.; Mendes, J.J.A., Jr.; Dias, T.S.; Siqueira, H.V.; Stevan, S.L., Jr. Surgical Instrument Signaling Gesture Recognition Using Surface Electromyography Signals. Sensors 2023, 23, 6233. https://doi.org/10.3390/s23136233