sEMG-Based Robust Recognition of Grasping Postures with a Machine Learning Approach for Low-Cost Hand Control
Abstract
1. Introduction
- The use of public datasets combined with laboratory datasets in the development of the ML technique;
- The computation of only two features from the sEMG signal for fast grasping posture detection in low-cost devices;
- The robustness of the ML technique against electrode shift, and the use of GPU-based implementations.
2. Related Work
3. Materials and Methods
3.1. ADL Grasping Postures
3.2. sEMG-Based HMI/HRI
3.3. Datasets
3.3.1. NinaPro Dataset 5
3.3.2. Experimental Data
- 10 healthy subjects:
  - 5 male, 5 female.
  - 9 right-handed, 1 left-handed.
- Ages ranged from 18 to 60 years.
3.4. Data Preprocessing: Feature Extraction
- Features in the prediction-model group perform poorly in classification tasks.
- Time-dependence features do not outperform those in the energy and complexity group.
- Using more than two features yields only a slight increase in classification accuracy.
3.5. Pattern Recognition (PR) Methods
3.5.1. Description of PR Techniques
3.5.2. Selection of the PR Technique
- Robustness against uncertainty in the signal;
- Generalization capabilities;
- Low memory expense (with a low number of neurons in the hidden layer);
- Real-time performance of the trained model.
3.5.3. A Machine Learning Approach for Grasping Posture Recognition
- Test 1: Input vector (datasets and features). Input data from the DS5 and the UJIdb were used to compute the 10 features described in Section 3.4, which were then used in the training and testing phases, providing a considerable amount of data. Globally, a total of 380,000 descriptors were calculated to train the network and 40,000 descriptors were reserved exclusively for testing. The tests were run with the computed features from the different databases, both separately and jointly, to identify the best results. In each test, the data from one subject were excluded from training and used in testing. All 385 combinations of 1, 2, 3, and 4 of the 10 features (C(10,1) + C(10,2) + C(10,3) + C(10,4) = 10 + 45 + 120 + 210 = 385) were tested as input data.
- Test 2: Number of neurons in the hidden layer. Based on the results obtained in [49], the tests started with 100 neurons in the hidden layer, and this number was increased or decreased in steps of 25 or 50 within the range 50 to 300: {50, 75, 100, 125, 150, 200, 250, 300}.
- Test 3: Activation function of the hidden layer. Tests were performed with the following activation functions, described in Appendix C: {elliotsig, hardlim, hardlims, logsig, poslin, purelin, radbas, satlin, satlins, tansig}.
- Test 4: Training function. Tests were performed with the following training functions, described in Appendix C: {trainbfg, traincgb, traincgf, traincgp, traingd, traingda, traingdm, traingdx, trainoss, trainrp, trainscg}.
- Test 5: Weights of neurons in the hidden layer. Tests were made with and without initializing the weights to the best known solution to assess the NN performance. In addition, the following tests were performed:
- Test 6: Data window and overlapping sizes. The sliding window approach was used because it increases the size of the training set (data augmentation) and induces robustness to noise in the learned model, which leads to better generalization.
- Test 7: Robustness against HRI displacement. Different groups of input data from the DS5 and the UJIdb were tested, implying the use of different HRIs: {DS5 upper Myo, DS5 lower Myo, DS5 upper Myo + UJIdb, DS5 lower Myo + UJIdb, DS5 upper Myo + DS5 lower Myo + UJIdb}. In each of these groups, data from one subject were excluded from training and used in testing. The 10 features described in Section 3.4 were computed for all these data, and all 385 combinations of 1, 2, 3, and 4 features were tested as input data.
- Test 8: Multiple neural networks. The efficacy of using three smaller NNs instead of one, with a voting system to choose the recognized grasping posture, was tested.
- Test 9: Parallelization. Tests were made using the parallelization provided by MATLAB, which can involve CPU nodes and/or GPU nodes working in parallel. Not all the functions provided by the Deep Learning Toolbox can run on GPU nodes. The results of these tests were assessed according to the following variables:
- MSE (Mean Squared Error) between the NN outputs ($y$) and the desired values (targets, $t$) for each EMG sensor and each posture. This value was calculated for each NN with the subject's data specifically reserved for the test phase, and it was slightly higher than the one reported by MATLAB, which randomly isolates data from the subjects used for training.
- RMSE (Root Mean Square Error) between the NN outputs and the targets for each EMG sensor and each posture, calculated as the square root of the MSE.
- Error percentage, i.e., the percentage of samples for which the output and the target differ. For this purpose, the posture with the highest output probability was considered the selected one.
- Performance value, i.e., the value of the performance function of the NN, selectable among those detailed in Appendix C, Table A3; it was set to MSE. This final value does not match the average MSE above, because the NN computes it on test data selected at random from all training subjects, whereas the actual MSE is calculated on a test subject never entered into the network, as desired to evaluate the prediction capabilities. Consequently, the performance value is, in general, smaller than the MSE value.
- Confusion matrix, which indicates how often an output is equal to or different from the target for each posture. Diagonal values indicate the number of successful outcomes (output = target), while off-diagonal values indicate misassignments (output ≠ target). The ideal confusion matrix is diagonal, as each posture would be recognized correctly. This matrix allows us to identify which grasps produce similar EMG signals. A computational sketch of these metrics follows this list.
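As an illustration, the following MATLAB sketch shows how these metrics can be computed from network outputs. The variable names (Y for outputs, T for one-hot targets of the held-out subject) are assumptions for illustration, not the authors' released code.

```matlab
% Minimal sketch (assumed variable names, not the authors' code):
% Y: NN outputs (9 postures x M samples), T: one-hot targets (9 x M)
% for the subject held out from training.
E      = Y - T;
mseVal = mean(E(:).^2);                    % MSE over all outputs and samples
rmse   = sqrt(mseVal);                     % RMSE
[~, yIdx] = max(Y, [], 1);                 % most probable posture per sample
[~, tIdx] = max(T, [], 1);                 % true posture per sample
errPct = 100 * mean(yIdx ~= tIdx);         % error percentage
C = accumarray([tIdx' yIdx'], 1, [9 9]);   % confusion matrix
                                           % (rows: targets, cols: outputs)
```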
4. Results
- Test Results 1: Input vector (datasets and features). The best RMSE results of the different NNs ranged from 26.96% to 27.46% for combinations of the features MAV, skewness, activity, complexity, and IEMG. The rest of the combinations produced higher RMSE results (reaching 38.65%).
- Test Results 2: Number of neurons in the hidden layer. Increasing the number of neurons in the hidden layer beyond a certain point produced a progressive increase in the mean RMSE; a minimum mean RMSE was identified at around 100 neurons across the different tests, consistent with previous experimentation [49]. Therefore, this number of neurons is generally adequate for the proposed architecture and problem.
- Test Results 3: Activation function of the hidden layer. The activation functions hardlim, hardlims, and logsig yielded very high average RMSE values, implying large errors in the outcomes. Furthermore, the purelin activation function failed to produce results in many cases due to the shape of the function. This is consistent with the recommendation of the rectified linear transfer function and its enhanced versions for multilayer perceptrons [51].
- Test Results 4: Training function. The training functions traingd, traingda, traingdm, traingdx, and trainrp are not adequate for the proposed architecture, as they produced very high RMSE values in all cases.
- Test Results 5: Weights of neurons in the hidden layer. This test provided similar RMSE values for networks with and without the hidden-layer weights initialized to the best solution.
- Test Results 6: Data window and overlapping sizes. The features were calculated in a window of 52 samples (260 ms) with an overlap of 47 samples (235 ms), as in [28]. The {window, overlap} pair {52, 47} produced the best RMSE values in general. The resulting step (5 samples) is small enough to preserve the characteristics of the signal between recalculations.
- Test Results 7: Robustness against HRI displacement. Similar results were obtained with all the configurations, with the lower Myo performing slightly better than the original configuration. Therefore, the algorithm is robust against variations in the position and orientation of the acquisition device, since the different grips are recognized with similar RMSE values (26–27%).
- Test Results 8: Multiple neural networks. The use of multiple NNs yielded higher RMSE values (around 38%) than the initial architecture. Therefore, the use of several NNs for grasp classification is not recommended.
- Test Results 9: Parallelization. Parallelization on the GPU presented some difficulties: the data must be converted to the gpuArray type, which has less precision than the original data, and some activation and training functions cannot run on the GPU. Still, parallelized training on both devices, the CPU and the GPU, was essential, since it made it possible to perform the tests described in this work. A configuration sketch combining the best settings from these results follows this list.
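Combining the best-performing settings above (the NN1 column of Table 3), a training script might look like the following MATLAB sketch. The data variables X and T are placeholders, so this is an illustration under stated assumptions rather than the authors' actual script.

```matlab
% Minimal sketch of the best-performing configuration (NN1), assuming
% placeholder data: X (16 x M feature vectors: MAV & Sk for 8 EMG channels)
% and T (9 x M one-hot grasp targets).
net = patternnet(150, 'traincgp', 'mse');   % hidden size, training, performance
net.layers{1}.transferFcn = 'poslin';       % hidden-layer activation
net.layers{2}.transferFcn = 'softmax';      % output-layer activation
[net, tr] = train(net, X, T, 'useParallel', 'yes');  % CPU pool; 'useGPU'
                                            % is possible for supported functions
scores = net(X);                            % posture probabilities per sample
```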
5. Discussion
6. Conclusions and Further Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Data Acquisition Protocol
1. The subject sits at a table and the operator places the Myo armband on the subject's right forearm.
2. The subject synchronizes the Myo armband with the computer, following the operator's instructions.
3. The operator introduces the data collection parameters: time (5 s), subject's ID, and grasps to be performed. The operator thoroughly explains to the subject how to carry out each grasp.
4. The operator selects the grasp, places the object in the initial position (B), presses the "Start" button, and waits for the image of the selected grasp to appear in the bottom-left corner of the data collection application (Figure 5).
5. Once the grasp image appears, the subject initiates the pre-grasping motion with the right hand from the rest position (A) on the table to the initial object position (B), lifts the object, and stays still until the image disappears (meanwhile, the operator captures the "Grasping posture" by pressing the corresponding button); the subject then releases the object in the final object position (C). These positions are shown in Figure A1.
6. The subject returns the hand to the rest position on the table.
7. Steps 4, 5, and 6 are repeated 3 times.
8. To continue the experiments, go to step 4; otherwise, continue to step 9.
9. The operator removes the Myo armband from the subject.
Appendix B. Selected Features
1. Mean Absolute Value (MAV). The mean of the absolute value of the sEMG signal (fully rectified signal). This descriptor is commonly used in EMG-based monitoring applications, as it can detect muscle contraction levels.
2. Integrated EMG (IEMG). The sum of the fully rectified signal, related to the EMG signal sequence firing point [46].
3. Root Mean Square (RMS). This parameter is closely related to the standard deviation, since both are equal when the mean of the signal is zero. It is calculated as $\mathrm{RMS} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} x_i^2}$, where $x_i$ are the $N$ samples in the data window.
4. Waveform Length (WL). A simple characterization of the waveform: the accumulated length of the waveform in a given time, $\mathrm{WL} = \sum_{i=1}^{N-1} |x_{i+1} - x_i|$.
5. Zero Crossing (ZC). It reflects the frequency at which the signal passes through zero, that is, the number of times the signal crosses the y-axis. The signal collected through the Myo armband takes values between −128 and 128, centered at zero.
6. Slope Sign Changes (SSC). It measures how often the sign of the signal slope changes. The number of changes between a positive and a negative slope across three consecutive samples is computed, with a threshold parameter (ε) included to reject noise in the sEMG signal.
7. Skewness (Sk). The third central moment of a distribution, which measures its asymmetry. It is computed using Equation (A5), $\mathrm{Sk} = \frac{1}{N}\sum_{i=1}^{N} \left(\frac{x_i - \bar{x}}{\sigma}\right)^3$, where $\bar{x}$ is the mean value and $\sigma$ is the standard deviation of the data window. The skewness can be positive or negative.
8. Activity (Ac). The surface of the power spectrum in the frequency domain, calculated as the variance of the signal: $\mathrm{Ac} = \sigma^2$.
9. Mobility (Mob). A representation of the mean frequency of the signal, $\mathrm{Mob} = \sqrt{\mathrm{Ac}(\dot{x})/\mathrm{Ac}(x)}$, where $\dot{x}$ is the first derivative of the signal with respect to time within the data window.
10. Complexity (Cx). It represents the change in the frequency of the signal, $\mathrm{Cx} = \mathrm{Mob}(\dot{x})/\mathrm{Mob}(x)$. A sketch computing these features over sliding windows follows this list.
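As a concrete illustration, the following MATLAB sketch computes these ten features for one sEMG channel over the sliding windows used in the paper (52 samples with a 5-sample step, i.e., an overlap of 47). The function name and the threshold value are assumptions for illustration, not the authors' code.

```matlab
% Minimal sketch (illustrative, not the authors' code): the ten features
% for one sEMG channel x over windows of length win with step samples
% between window starts; epsTh is the SSC noise threshold.
function F = emg_features(x, win, step, epsTh)
    starts = 1:step:(numel(x) - win + 1);
    F = zeros(numel(starts), 10);            % one row of features per window
    for k = 1:numel(starts)
        w = x(starts(k):starts(k)+win-1);
        d = diff(w);                          % first derivative (finite diff.)
        m = mean(w);  s = std(w);
        F(k,1)  = mean(abs(w));                          % MAV
        F(k,2)  = sum(abs(w));                           % IEMG
        F(k,3)  = sqrt(mean(w.^2));                      % RMS
        F(k,4)  = sum(abs(d));                           % WL
        F(k,5)  = sum(w(1:end-1).*w(2:end) < 0);         % ZC
        F(k,6)  = sum(d(1:end-1).*d(2:end) < -epsTh);    % SSC (thresholded)
        F(k,7)  = mean(((w - m)./s).^3);                 % Skewness
        F(k,8)  = var(w);                                % Activity
        F(k,9)  = sqrt(var(d)/var(w));                   % Mobility
        F(k,10) = sqrt(var(diff(d))/var(d)) / F(k,9);    % Complexity
    end
end
% Example call: F = emg_features(double(emgChannel), 52, 5, 0.01);
```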
Appendix C. Activation, Training, and Performance Functions for NN in MATLAB
Activation Function | Description |
---|---|
compet | Competitive transfer function. |
elliotsig | Elliot sigmoid transfer function. |
hardlim | Positive hard limit transfer function. |
hardlims | Symmetric hard limit transfer function. |
logsig | Logarithmic sigmoid transfer function. |
netinv | Inverse transfer function. |
poslin | Positive linear transfer function. |
purelin | Linear transfer function. |
radbas | Radial basis transfer function. |
radbasn | Radial basis normalized transfer function. |
satlin | Positive saturating linear transfer function. |
satlins | Symmetric saturating linear transfer function. |
softmax | Soft max transfer function. |
tansig | Symmetric sigmoid transfer function. |
tribas | Triangular basis transfer function. |
Training Function | Acronym | Learning Algorithm |
---|---|---|
trainlm | LM | Levenberg–Marquardt |
trainbr | BR | Bayesian Regularization |
trainbfg | BFGS | BFGS Quasi-Newton |
trainrp | RP | Resilient Backpropagation |
trainscg | SCG | Scaled Conjugate Gradient |
traincgb | CGB | Conjugate Gradient with Powell/Beale Restarts |
traincgf | CGF | Fletcher–Powell Conjugate Gradient |
traincgp | CGP | Polak–Ribière Conjugate Gradient |
trainoss | OSS | One Step Secant |
traingdx | GDX | Variable Learning Rate Gradient Descent |
traingdm | GDM | Gradient Descent with Momentum |
traingd | GD | Gradient Descent backpropagation |
Performance Function | Description |
---|---|
mae | Mean Absolute Error |
mse | Mean Squared Normalized Error |
sae | Sum Absolute Error |
sse | Sum Square Error |
crossentropy | Cross-entropy Calculation |
msesparse | Mean Squared Error with L2 weight and sparsity regularizers |
References
- Vergara, M.; Sancho-Brú, J.L.; Gracia-Ibáñez, V.; Pérez-González, A. An introductory study of common grasps used by adults during performance of activities of daily living. J. Hand Ther. 2014, 3, 225–234. [Google Scholar] [CrossRef]
- Llop-Harillo, I.; Pérez-González, A.; Starke, J.; Asfour, T. The anthropomorphic hand assessment protocol (AHAP). Robot. Auton. Syst. 2019, 121, 103259. [Google Scholar] [CrossRef]
- Maheu, V.; Archambault, P.S.; Frappier, J.; Routhier, F. Evaluation of the JACO robotic arm: Clinico-economic study for powered wheelchair users with upper-extremity disabilities. In Proceedings of the IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; pp. 1–5. [Google Scholar]
- Espinosa, M.; Nathan-Roberts, D. Understanding Prosthetic Abandonment. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2019, 63, 1644–1648. [Google Scholar] [CrossRef]
- Smail, L.C.; Neal, C.; Wilkins, C.; Packham, T.L. Comfort and function remain key factors in upper limb prosthetic abandonment: Findings of a scoping review. Disabil. Rehabil. Assist. Technol. 2021, 16, 821–830. [Google Scholar] [CrossRef] [PubMed]
- Blana, D.; Kyriacou, T.; Lambrecht, J.M.; Chadwick, E.K. Feasibility of using combined EMG and kinematic signals for prosthesis control: A simulation study using a virtual reality environment. J. Electromyogr. Kinesiol. 2016, 29, 21–27. [Google Scholar] [CrossRef] [PubMed]
- Purushothaman, G. Myoelectric control of prosthetic hands: State-of-the-art review. Med. Devices Evid. Res. 2016, 9, 247–255. [Google Scholar]
- Ramos-Murguialday, A.; Schürholz, M.; Caggiano, V.; Wildgruber, M.; Caria, A.; Hammer, E.M.; Halder, S.; Birbaumer, N. Proprioceptive feedback and brain computer interface (BCI) based neuroprostheses. PLoS ONE 2012, 7, e47048. [Google Scholar] [CrossRef]
- Gonzalez-Vargas, J.; Dosen, S.; Amsuess, S.; Yu, W.; Farina, D. Human-Machine Interface for the Control of Multi-Function Systems Based on Electrocutaneous Menu: Application to Multi-Grasp Prosthetic Hands. PLoS ONE 2015, 10, e0127528. [Google Scholar] [CrossRef]
- Ortiz-Catalán, M.; Håkansson, B.; Brånemark, R. An osseointegrated human-machine gateway for long-term sensory feedback and motor control of artificial limbs. Sci. Transl. Med. 2014, 6, 257re6. [Google Scholar] [CrossRef]
- Szkopek, J.; Redlarski, G. Artificial-Hand Technology—Current State of Knowledge in Designing and Forecasting Changes. Appl. Sci. 2019, 9, 4090. [Google Scholar] [CrossRef]
- Shadow Robots. Available online: https://www.shadowrobot.com/dexterous-hand-series/ (accessed on 20 April 2023).
- Liu, H.; Wu, K.; Meusel, P.; Seitz, N.; Hirzinger, G.; Jin, M.H.; Liu, Y.W.; Fan, S.W.; Chen, Z.P. Multisensory Five-Finger Dexterous Hand: The DLR/HIT Hand II. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and System, Nice, France, 22–26 September 2008; pp. 3692–3697. [Google Scholar]
- Shang, W.; Song, F.; Zhao, Z.; Gao, H.; Cong, S.; Li, Z. Deep Learning Method for Grasping Novel Objects Using Dexterous Hands. IEEE Trans. Cybern. 2022, 52, 2750–2762. [Google Scholar] [CrossRef] [PubMed]
- Bai, Q.; Li, S.; Yang, J.; Song, Q.; Li, Z.; Zhang, X. Object Detection Recognition and Robot Grasping Based on Machine Learning: A Survey. IEEE Access 2020, 8, 181855–181879. [Google Scholar] [CrossRef]
- Billard, A.; Kragic, D. Trends and challenges in robot manipulation. Science 2019, 364, eaat8414. [Google Scholar] [CrossRef] [PubMed]
- Bi, L.; Feleke, A.; Guan, C. A review on EMG-based motor intention prediction of continuous human upper limb motion for human-robot collaboration. Biomed. Signal Process. Control 2019, 51, 113–127. [Google Scholar] [CrossRef]
- Gyles, C. Robots in medicine. Can. Vet. J. 2019, 60, 819–820. [Google Scholar] [PubMed]
- Liu, L.; Yang, S.H.; Wang, Y.; Meng, Q. Home Service Robotics. Meas. Control 2009, 42, 12–17. [Google Scholar] [CrossRef]
- Open Hand Project. Available online: https://www.youtube.com/@OpenhandprojectOrg (accessed on 22 March 2024).
- Enabling the Future. Available online: https://enablingthefuture.org/ (accessed on 22 April 2023).
- Amaral, P.; Silva, F.; Santos, V. Recognition of Grasping Patterns Using Deep Learning for Human–Robot Collaboration. Sensors 2023, 23, 8989. [Google Scholar] [CrossRef]
- Mim, T.R.; Amatullah, M.; Afreen, S.; Yousuf, M.A.; Uddin, S.; Alyami, S.A.; Hasan, K.F.; Moni, M.A. GRU-INC: An inception-attention based approach using GRU for human activity recognition. Expert Syst. Appl. 2023, 216, 119419. [Google Scholar] [CrossRef]
- Calado, A.; Soares, F.; Matos, D. A Review on Commercially Available Anthropomorphic Myoelectric Prosthetic Hands, Pattern-Recognition-Based Microcontrollers and sEMG Sensors used for Prosthetic Control. In Proceedings of the IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Porto, Portugal, 24–26 April 2019; pp. 1–6. [Google Scholar]
- Phinyomark, A.; Scheme, E. EMG Pattern Recognition in the Era of Big Data and Deep Learning. Big Data Cogn. Comput. 2018, 2, 21. [Google Scholar] [CrossRef]
- Padmanabhan, P.; Puthusserypady, S. Nonlinear Analysis of EMG Signals—A Chaotic Approach. In Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Francisco, CA, USA, 1–5 September 2004; Volume 1, pp. 608–611. [Google Scholar]
- Phinyomark, A.; Quaine, F.; Charbonnier, S.; Serviere, C.; Tarpin-Bernard, F.; Laurillau, Y. EMG Feature Evaluation for Improving Myoelectric Pattern Recognition Robustness. Expert Syst. Appl. 2013, 40, 4832–4840. [Google Scholar] [CrossRef]
- Cote-Allard, U.; Fall, C.L.; Drouin, A.; Campeau-Lecours, A.; Gosselin, C.; Glette, K.; Laviolette, F.; Gosselin, B. Deep Learning for Electromyographic Hand Gesture Signal Classification Using Transfer Learning. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 760–771. [Google Scholar] [CrossRef] [PubMed]
- Neacsu, A.A.; Cioroiu, G.; Radoi, A.; Burileanu, C. Automatic EMG-based Hand Gesture Recognition System using Time-Domain Descriptors and Fully-Connected Neural Networks. In Proceedings of the 42nd International Conference on Telecommunications and Signal Processing (TSP), Budapest, Hungary, 1–3 July 2019; pp. 232–235. [Google Scholar]
- Batzianoulis, I.; Krausz, N.E.; Simon, A.M.; Hargrove, L.; Billard, A. Decoding the grasping intention from electromyography during reaching motions. J. Neuroeng. Rehabil. 2018, 15, 57. [Google Scholar] [CrossRef] [PubMed]
- Purushothaman, G.; Vikas, R. Identification of a Feature Selection Based Pattern Recognition Scheme for Finger Movement Recognition from Multichannel EMG Signals. Australas. Phys. Eng. Sci. Med. 2018, 41, 549–559. [Google Scholar] [CrossRef] [PubMed]
- Cutkosky, M.R. On grasp choice, grasp models, and the design of hands for manufacturing tasks. IEEE Trans. Rob. Autom. 1989, 5, 269–279. [Google Scholar] [CrossRef]
- Feix, T.; Romero, J.; Schmiedmayer, H.-B.; Dollar, A.M.; Kragic, D. The GRASP Taxonomy of Human Grasp Types. IEEE Trans. Hum.-Mach. Syst. 2016, 46, 66–77. [Google Scholar] [CrossRef]
- Edwards, S.; Buckland, D.; McCoy-Powlen, J. Developmental and Functional Hand Grasps; Slack Incorporated: San Francisco, CA, USA, 2002. [Google Scholar]
- Kilbreath, S.L.; Heard, R.C. Frequency of hand use in healthy older persons. Aust. J. Physiother. 2005, 51, 119–122. [Google Scholar] [CrossRef] [PubMed]
- Merletti, R.; Parker, P.A. Detection and Conditioning of the Surface EMG Signal. In Electromyography: Physiology, Engineering, and Non-Invasive Applications; Wiley-IEEE Press: Hoboken, NJ, USA, 2004; pp. 107–131. [Google Scholar]
- Myo Armband. Thalmics Lab. Available online: https://github.com/thalmiclabs (accessed on 25 April 2023).
- Atzori, M.; Gijsberts, A.; Castellini, C.; Caputo, B.; Hager, A.G.; Elsig, S.; Giatsidis, G.; Bassetto, F.; Müller, H. Electromyography data for non-invasive naturally-controlled robotic hand prostheses. Sci. Data 2014, 1, 140053. [Google Scholar] [CrossRef]
- Pizzolato, S.; Tagliapietra, L.; Cognolato, M.; Reggiani, M.; Müller, H.; Atzori, M. Comparison of six electromyography acquisition setups on hand movement classification tasks. PLoS ONE 2017, 12, e0186132. [Google Scholar] [CrossRef]
- Atzori, M.; Gijsberts, A.; Müller, H.; Caputo, B. Classification of hand movements in amputated subjects by sEMG and accelerometers. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 3545–3549. [Google Scholar]
- Calli, B.; Walsman, A.; Singh, A.; Srinivasa, S.; Abbeel, P.; Dollar, A.M. Benchmarking in Manipulation Research: Using the Yale-CMU-Berkeley Object and Model Set. IEEE Robot. Autom. Mag. 2015, 22, 36–52. [Google Scholar] [CrossRef]
- Calli, B.; Walsman, A.; Singh, A.; Srinivasa, S.; Abbeel, P.; Dollar, A.M. The YCB Object and Model Set: Towards Common Benchmarks for Manipulation Research. In Proceedings of the 2015 IEEE International Conference on Advanced Robotics (ICAR), Istanbul, Turkey, 27–31 July 2015. [Google Scholar]
- Calli, B.; Singh, A.; Bruce, J.; Walsman, A.; Konolige, K.; Srinivasa, S.; Abbeel, P.; Dollar, A.M. Yale-CMU-Berkeley dataset for robotic manipulation research. Int. J. Robot. Res. 2017, 36, 261–268. [Google Scholar] [CrossRef]
- YCB Benchmarks—Object and Model Set. Available online: https://www.ycbbenchmarks.com/ (accessed on 23 April 2023).
- Matlab Software, Mathworks. Available online: https://es.mathworks.com/products/matlab.html (accessed on 26 April 2023).
- Phinyomark, A.; Phukpattaranont, P.; Limsakul, C. Feature reduction and selection for EMG signal classification. Expert Syst. Appl. 2012, 39, 7420–7431. [Google Scholar] [CrossRef]
- Hudgins, B.; Parker, P.; Scott, R. A new strategy for multifunction myoelectric control. IEEE Trans. Biomed. Eng. 1993, 40, 82–94. [Google Scholar] [CrossRef]
- Karlik, B. Machine Learning Algorithms for Characterization of EMG Signals. Int. J. Inf. Electron. Eng. 2014, 4, 189–194. [Google Scholar] [CrossRef]
- Mora, M.C.; Sancho-Bru, J.L.; Pérez-González, A. Hand Posture Prediction Using Neural Networks within a Biomechanical Model. Int. J. Adv. Robot. Syst. 2012, 9, 139. [Google Scholar] [CrossRef]
- Theodoridis, S. Chapter 18—Neural Networks and Deep Learning. In Machine Learning: A Bayesian and Optimization Perspective, 2nd ed.; Academic Press: Cambridge, MA, USA, 2020; pp. 901–1038. [Google Scholar]
- Cerdá-Boluda, J.; Gadea-Gironés, R. Chapter 10—Xarxes neuronals. In Introducció als Sistemes Complexos, als Autòmats cel·Lulars i a les Xarxes Neuronals, 1st ed.; Universitat Politècnica de València: Valencia, Spain, 2009; pp. 325–400. [Google Scholar]
- The Mathworks, Documentation. Traincgp, Conjugate Gradient Backpropagation with Polak-Ribiére Updates. Available online: https://uk.mathworks.com/help/deeplearning/ref/traincgp.html (accessed on 7 March 2024).
- Fletcher, R.; Reeves, C.M. Function minimization by conjugate gradients. Comput. J. 1964, 7, 149–154. [Google Scholar] [CrossRef]
- Hagan, M.T.; Demuth, H.B.; Beale, M.H. Neural Network Design; PWS Publishing: Boston, MA, USA, 1996. [Google Scholar]
- Khushaba, R.N.; Kodagoda, S. Electromyogram (EMG) feature reduction using mutual components analysis for multifunction prosthetic fingers control. In Proceedings of the 12th IEEE International Conference on Control Automation Robotics & Vision (ICARCV), Guangzhou, China, 5–7 December 2012; pp. 1534–1539. [Google Scholar]
- Baheti, P. 12 Types of Neural Network Activation Functions: How to Choose? Available online: https://www.v7labs.com/blog/neural-networks-activation-functions (accessed on 19 October 2023).
- The Mathworks, Documentation. Choose a Multilayer Neural Network Training Function. Available online: https://uk.mathworks.com/help/deeplearning/ug/choose-a-multilayer-neural-network-training-function.html (accessed on 19 October 2023).
- The Mathworks, Documentation. Patternnet. Available online: https://uk.mathworks.com/help/deeplearning/ref/patternnet.html (accessed on 19 October 2023).
ID | Grasp Name | Description |
---|---|---|
PP | Pulp Pinch | Only the thumb and the tip of the fingers are used. Unused fingers may be in extension or flexion. |
LP | Lateral Pinch | The lateral part of the fingers (one or more) is used, and usually the thumb too. The rest of the fingers are flexed. |
DV | Diagonal Volar Grip | Variant of the cylindrical grip when the object is not parallel to the palm. In this case, the thumb is abducted, parallel to the object. |
Cyl | Cylindrical Grip | The palm is involved during the grasp as it touches the object and is arched. The thumb is in direct opposition to the rest of the fingers. |
Ext | Extension Grip | The thumb and proximal part of the fingers are involved in the grasp, but the palm is not. |
Trip | Tripod Pinch | The thumb and two more fingers are used, being able to use the tip or the side of the latter. |
Sph | Spherical Grip | The hand curves to hold the object with all the fingers abducted and with the intervention of the palm. |
Hk | Hook | The palm and the thumb are not involved in the grip since the entire weight of the object is held by the fingers. |
Rs | Rest | The fingers and the palm are not exerting any force. |
Grasp Type | ID in Ninapro DS5, Exercise C | Objects |
---|---|---|
Pulp Pinch | 6 | Small Water Bottle |
 | 15 | Coin |
Lateral Pinch | 14 | Coin |
Diagonal Volar Grip | 8 | Pencil |
 | 17 | Card |
Cylindrical Grip | 1 | Large Bottle |
 | 2 | Handle |
Extension Grip | 18 | Book |
 | 19 | Plate |
Tripod Pinch | 9 | Marker |
 | 13 | Tennis Ball |
Spherical Grip | 10 | Tennis Ball |
Hook | 3 | Tall Glass |
Rest | 0 | None |
Neural Network ID | 1 | 2 | 3 | 4 |
---|---|---|---|---|
Type of architecture | Feed-forward neural network with a hidden layer | |||
Number of features as inputs | 2 × 8 EMG = 16 inputs | 3 × 8 EMG = 24 inputs | 3 × 8 EMG = 24 inputs | 4 × 8 EMG = 32 inputs |
Descriptors/features | MAV & Sk | MAV & Ac & Sk | Cx & IEMG & Sk | Ac & Cx & IEMG & Sk |
Mean RMSE | 0.2696 | 0.2706 | 0.2717 | 0.2706 |
Training function | traincgp | traincgb | trainoss | trainscg |
Neurons in the hidden layer | 150 | 120 | 150 | 75 |
Activation func. (hidden layer) | poslin | satlins | elliotsig | radbas |
Activation func. (output layer) | softmax | softmax | softmax | softmax |
Performance function | mse | mse | mse | mse |
Grasp Type | MSE | RMSE |
---|---|---|
Pulp pinch | 0.0941 | 0.3068 |
Lateral pinch | 0.0610 | 0.2470 |
Diagonal volar grip | 0.1042 | 0.3229 |
Cylindrical grip | 0.0987 | 0.3141 |
Extension grip | 0.0794 | 0.2818 |
Tripod pinch | 0.0958 | 0.3096 |
Spherical grip | 0.0754 | 0.2747 |
Hook | 0.0716 | 0.2676 |
Rest | 0.0105 | 0.1022 |
Total | 0.0768 ± 0.0287 | 0.2696 ± 0.0675 |