Article

sEMG Signals Characterization and Identification of Hand Movements by Machine Learning Considering Sex Differences

College of Biomedical Engineering, Taiyuan University of Technology, Taiyuan 030024, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(6), 2962; https://doi.org/10.3390/app12062962
Submission received: 28 November 2021 / Revised: 24 February 2022 / Accepted: 10 March 2022 / Published: 14 March 2022

Abstract

Developing a robust machine-learning algorithm to detect hand motion is one of the most challenging aspects of prosthetic hand and exoskeleton design. Machine-learning methods that considered sex differences were used to identify and characterize hand movement patterns in healthy individuals. For this purpose, surface electromyographic (sEMG) signals were acquired from muscles in the forearm and hand. Statistical analysis indicated that most of the same muscle pairs in the right hand (females and males) showed significant differences during the six hand movements. Time-domain features were used as input to machine-learning algorithms for the recognition of six gestures. Specifically, two types of hand-gesture recognition methods that considered sex differences (differentiating sex datasets and adding a sex label) were proposed and applied to the k-nearest neighbor (k-NN), support vector machine (SVM) and artificial neural network (ANN) algorithms for comparison. In addition, a t-test and 5-fold cross validation were used to verify whether considering sex differences could significantly improve classification performance; the results demonstrated that it can. The ANN algorithm with the addition of a sex label performed best in movement classification (98.4% accuracy). In the future, hand movement recognition algorithms that consider sex differences could be applied to control systems for prosthetic hands or exoskeletons.

1. Introduction

The human hand is an amazingly precise and agile apparatus, used to perform actions that range from delicate and intricate to forceful and strenuous [1]. Dysfunction and partial loss of the hand can deeply impact many activities of daily life. Of the 18,500 upper limb amputations that occur each year in the USA, 91% occur at or distal to the wrist [2,3]. Prosthetic devices can help meet the challenges these people face in daily life, but truly robust and flexible devices are not yet available to the majority of them. Moreover, most of today's devices can only perform a small set of fixed hand gestures and do not fully imitate the human hand. Currently, the more advanced commercial prosthetic devices typically perform hand and grip positioning [4] with independent control of only one finger, such as the i-Limb Ultra hand of Touch Bionics, the Michelangelo hand of Ottobock and the Bebionic 3 hand of RSL Steeper. Approximately 100–200 out of every 100,000 people worldwide suffer severely impaired hand function after a stroke each year [5,6], with close to 80% developing upper limb impairment [7]. The resulting impairment in performing various tasks may affect patients' independence and lead to long-term disability. Recent advances in hand rehabilitation have shown that robot-assisted therapy can help restore motor function [8,9,10,11,12,13]. Exoskeleton devices can support the fingers and hand and can also provide direct control of the hand joints. The functions of exoskeletons range from knob movements and assistance of broad grasp, such as the Haptic Knob and HWARD devices [14,15], to highly specialized control of finger and hand movements, such as the CyberGrasp, the Hand of Hope [16], the HEXORR [17], the Soft Robotic Glove [18], the Gloreha [19] and the ReHand [20]. The control systems of prosthetic hands and exoskeletons can drive the system response through the user's own electrical signals [21]. Both electroencephalography (EEG) and electromyography (EMG) signals can directly reflect a patient's control and movement intention. In particular, surface electromyography (sEMG) is valuable in the study of prosthetic hands and assisted rehabilitation robots [22,23] due to its advantages of simple acquisition and processing, wireless sensing, wearable electrodes and the ability to provide information on movement intention 50–100 ms before the actual movement [24]. For instance, surface electrodes over two forearm muscles, the extensor digitorum communis and the flexor digitorum superficialis, can partly record the flexion and extension intentions of the fingers [25]. However, in practical applications, sEMG signals are highly dependent on the instrumentation, methods and procedures used in the system, and the signal acquisition process introduces varying degrees of artifacts and crosstalk [25].
Thus, robust and accurate machine-learning methods need to be developed to detect hand movements and control exoskeletons and prosthetic hands in real time. In previous studies, features in the time domain, frequency domain and time–frequency domain have been used to detect hand motion, with time-domain features generally considered to perform better than frequency-domain or time–frequency-domain features [26,27]. In the time domain, the most commonly used features are Mean Absolute Value (MAV), Waveform Length (WL), Willison Amplitude (WAMP) and Zero Crossing (ZC) [28,29,30]. Robinson et al. [30] classified common gestures using only time-domain features, with accuracy of up to 90.57%. Because of the high redundancy of time-domain features, dimensionality reduction techniques such as principal component analysis have been widely applied to improve the results [27]. On the other hand, multiple gesture pattern recognition algorithms based on surface EMG signals have been proposed. Al-Timemy et al. [31] used 12-channel sEMG signals and linear discriminant analysis (LDA) with 11 time-domain features as inputs to classify 15 different finger movements in 10 non-amputees and 6 below-elbow amputees with an accuracy of more than 90%. Oskoei and Hu [32] studied the application of the support vector machine (SVM) algorithm to the classification of myoelectric control systems. Naik et al. [27] used 11-channel EMG signals, Independent Component Analysis (ICA) and Icasso clustering with 9 time-domain features as input to classify 11 gestures with an accuracy of 96.6%. Ariyanto et al. [33] decoded five finger movements in 11 healthy subjects from 16 time-domain features using an artificial neural network (ANN) with an accuracy of over 96%. In gesture pattern recognition, more than nine features per EMG channel are frequently used as classifier input. Using more features does improve the robustness of the classification, but it also generates high redundancy and increases the computing time, which in turn delays the control system. It is therefore important to reduce the number of EMG features through dimensionality reduction without reducing the robustness of the classification system.
In addition, although a variety of machine-learning algorithms have been proposed in previous studies, the three classifiers with the best classification results are k-Nearest Neighbor (k-NN), Support Vector Machine (SVM) and Artificial Neural Network (ANN). Furthermore, although the results for finger movement classification appear promising, many previous studies only examined the use of EMG features as classification inputs and did not consider the impact of male and female differences in muscles and sEMG signals on classification results. Sex-related differences in kinematics [34] and EMG [35] signals are evident in dynamic manual manipulation tasks of the upper limbs. In addition, Hunter et al. [36], comparing muscle fatigue between strength-matched adult males and females, found that EMG activity differed between the sexes, with a lower rate of increase in mean-corrected EMG in female muscles than in male muscles. Kent-Braun et al. [37], examining age and gender differences in human skeletal muscle responses during incremental isometric exercise, found that intracellular concentrations of Pi and H2PO4 increased more, and pH decreased more, in males than in females. Manjuanth et al. [38] and Kambayashi et al. [39] also demonstrated sex differences in muscle EMG and muscle oxygenation levels during upper limb exercise, respectively. We speculated that taking sex differences into account in gesture pattern recognition might be a breakthrough in improving the robustness and accuracy of pattern recognition.
In this study, two machine-learning methods that consider sex differences in muscles were proposed to identify and characterize six hand movements in healthy individuals using multichannel sEMG signals. For this purpose, the sEMG signals of the subjects (right-hand muscles in males and females) were analyzed by feature extraction and statistical analysis. The k-NN, SVM and ANN classifiers were then applied to compare whether considering sex differences in muscle significantly affects the classification results. It was hypothesized that machine-learning methods that consider sex differences in muscle would yield better classification results and higher average prediction accuracy.

2. Materials and Methods

2.1. Subjects

Twenty healthy right-handed volunteers (10 males, 10 females; age: 24.2 ± 2.8 years; BMI: 21.7 ± 2.8) were recruited and agreed to participate in the study; they reported no visual impairment, neurological disease or upper limb musculoskeletal trauma. Handedness was determined by the Edinburgh Handedness Inventory. All experimental protocols and methods were carried out in accordance with relevant guidelines and regulations and were approved by the Biomedical Ethics Committee of Taiyuan University of Technology. Subjects were fully informed of the purpose and procedure of the study and signed a written informed consent form.

2.2. sEMG Signal Acquisition and Preprocessing

Figure 1 details the experimental protocol for acquiring sEMG signals for six different hand gestures. On the basis of previous studies [21,40], 6 surface electrodes (Ag/AgCl) were applied to 6 muscles of the forearm and hand: abductor pollicis brevis (APB), flexor digitorum superficialis (FDS), brachioradialis (BRA), flexor carpi ulnaris (FCU), extensor carpi radialis (EC) and extensor digitorum communis (EDC). The sEMG signals from the six muscles were simultaneously acquired and recorded by a Noraxon Ultium wireless EMG system at a sampling frequency of 2000 Hz. During the experiment, the subjects were asked to perform 20 repetitions of each of the 6 gestures, resting for 2–3 s between repetitions and 5 min between gestures.
For the acquisition and digitization of the signals, the commercial Noraxon Ultium EMG sensor and MyoResearch software were used. A prior characterization of the sensor was required, since the datasheet did not provide detailed sensor specifications. The sEMG sensor had a 16-bit gain-adjustable analog output, a fixed analog output delay of 300 ms, a minimum input impedance of 1000 MΩ and a maximum CMRR of 100 dB. The EMG signal was digitized at a 2 kHz sampling frequency with a 24-bit ADC. A 20–500 Hz FIR band-pass filter, a rectifier and a 50 ms RMS smoothing filter were used in the digital signal processing. The time-domain features were calculated over an analysis window of the muscle activation state. The analysis window was 250 ms (containing the entire muscle activation cycle). For each analysis window, a feature set was computed, and these features were provided to a pattern classifier. Once the EMG signal was filtered, four time-domain features were computed: Integral Electromyographic (iEMG), Mean Absolute Value (MAV), Input Contribution Rate (ICRi) and Variance (VAR), as detailed in Table 1.
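As a concrete illustration of this processing chain, the following Python sketch (NumPy/SciPy, our tooling choice) implements the band-pass filtering, rectification, RMS smoothing and the Table 1 time-domain features; the FIR filter order and all function names are our assumptions, since the paper specifies only the passband and window lengths.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

FS = 2000  # sampling frequency in Hz, as in the acquisition setup

def preprocess(raw_emg, fs=FS, numtaps=101):
    """Band-pass (20-500 Hz FIR), full-wave rectify, then 50 ms RMS smoothing.
    numtaps is an assumption; the paper specifies only the 20-500 Hz passband."""
    taps = firwin(numtaps, [20, 500], pass_zero=False, fs=fs)
    filtered = filtfilt(taps, 1.0, raw_emg)          # zero-phase FIR filtering
    rectified = np.abs(filtered)                     # full-wave rectification
    win = int(0.05 * fs)                             # 50 ms smoothing window
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))

def time_features(window, fs=FS):
    """iEMG, MAV and VAR over one 250 ms analysis window (see Table 1)."""
    dt = 1.0 / fs
    iemg = np.sum(np.abs(window)) * dt               # integral of |EMG(t)| dt
    mav = np.mean(np.abs(window))                    # mean absolute value
    var = np.sum(window ** 2) / (len(window) - 1)    # variance (zero-mean EMG)
    return iemg, mav, var

def icr(iemg_per_muscle):
    """ICRi: each muscle's share of the summed iEMG across the six channels."""
    iemg = np.asarray(iemg_per_muscle, dtype=float)
    return iemg / iemg.sum()
```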
The feature set for the right hand was considered to contain 24 (6 × 4) features. Two methods were used to reduce this large feature set: dimensionality reduction and feature selection. First, the ICRi value is the percentage of each muscle's iEMG in the sum of the iEMG values of the six muscles during a given movement; since ICRi can be calculated directly from iEMG, it was excluded. Second, statistical analysis was performed in SPSS 23 (SPSS Inc., Chicago, IL, USA) to analyze the average correlation between the hand gestures and the time-domain features. Of the three remaining features, only iEMG was strongly correlated with gesture pattern (APB: 0.622 **, FDS: 0.573 **, BRA: 0.488 **, FCU: 0.411 **, EC: 0.525 **, EDC: 0.475 **; ** represents a p-value less than 0.01). Finally, the 6-channel iEMG was selected as the input for hand-gesture recognition.
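A minimal sketch of this correlation-based screening, assuming the per-event features sit in a pandas DataFrame with hypothetical column names; Spearman correlation is used here as a stand-in for the SPSS correlation analysis.

```python
import pandas as pd
from scipy.stats import spearmanr

# df: one row per event; hypothetical columns 'iEMG_APB' ... 'iEMG_EDC' and 'gesture'
def screen_features(df, feature_cols, label_col="gesture"):
    """Rank candidate features by the strength of their correlation with gesture."""
    rows = []
    for col in feature_cols:
        rho, p = spearmanr(df[col], df[label_col])
        rows.append({"feature": col, "rho": rho, "p": p})
    return pd.DataFrame(rows).sort_values("rho", key=lambda s: s.abs(), ascending=False)
```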

2.3. Statistical Analysis

The statistical analysis was performed using SPSS 23 (SPSS Inc., Chicago, IL, USA). A one-way ANOVA was performed on iEMG values for the same movements and muscles to detect statistically significant differences between females and males. Meanwhile, the statistical significance of the three factors of sex, motion and muscle was determined by a three-way ANOVA followed by a Tukey test. Variables with statistically significant differences are denoted by * in this paper. The alpha level was set at 0.05.
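For reproducibility, the sketch below shows how the same three-way ANOVA and Tukey HSD post hoc comparisons could be run in Python with statsmodels; the column names and the synthetic stand-in data are our assumptions, since the original analysis was performed in SPSS.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Synthetic stand-in data with the paper's factor structure; replace with real iEMG data
rng = np.random.default_rng(0)
n = 720
df = pd.DataFrame({
    "iemg": rng.gamma(2.0, 500.0, n),
    "sex": rng.choice(["F", "M"], n),
    "motion": rng.choice([f"MT-{i}" for i in range(1, 7)], n),
    "muscle": rng.choice([f"MUSCLE-{i}" for i in range(1, 7)], n),
})

# Three-way ANOVA with all interactions (cf. Table 7)
model = ols("iemg ~ C(sex) * C(motion) * C(muscle)", data=df).fit()
print(sm.stats.anova_lm(model, typ=3))

# Tukey HSD pairwise comparisons for the motion factor (cf. Table A1)
print(pairwise_tukeyhsd(df["iemg"], df["motion"], alpha=0.05))
```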

2.4. Machine Learning for Gesture Discrimination

Machine-learning methods were used to recognize the sEMG signals of the 6 gestures. The signal obtained each time a subject repeats a gesture is called an event. Each subject contributed at least 20 events per gesture, with intervals of 2–3 s between repetitions. Overall, the experiment yielded at least 2400 events in total and at least 400 events per gesture (20 subjects × at least 20 repetitions).
From the feature space, three types of classification algorithms were trained for the 6 hand gestures of interest: k-NN, SVM and ANN. For these supervised algorithms, 80% of the dataset was used as the training set for parameter tuning and model fitting, and 20% was used as the test set to evaluate prediction performance. In addition, 10-fold cross validation and grid search were used to determine the optimal number of nearest neighbors for the k-NN classifier and the optimal combination of kernel function and penalty coefficient for the SVM classifier. For the ANN classifier, we varied the training algorithm and activation function with two or three hidden layers to achieve the optimal prediction.
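A sketch of this training protocol using scikit-learn (our tooling choice; the paper does not name its ML software). The synthetic X and y arrays and the grid values are illustrative assumptions; the ANN configuration follows Tables 3 and 4 (up to three ReLU hidden layers of 128/64/32 units, learning rate 10^−4).

```python
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Synthetic stand-ins for 2400 events x 6 iEMG features and gesture labels 1-6
rng = np.random.default_rng(0)
X = rng.gamma(2.0, 500.0, size=(2400, 6))
y = rng.integers(1, 7, size=2400)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# 10-fold CV grid search for k-NN (neighbors) and SVM (kernel, penalty C);
# the grid values are illustrative, not the exact grids from the paper
knn = GridSearchCV(KNeighborsClassifier(), {"n_neighbors": range(1, 11)}, cv=10)
svm = GridSearchCV(make_pipeline(StandardScaler(), SVC()),
                   {"svc__kernel": ["rbf", "poly"],
                    "svc__C": [1, 10, 100, 1000, 10000]}, cv=10)
# ANN configuration per Tables 3 and 4: up to three ReLU hidden layers, lr = 1e-4
ann = MLPClassifier(hidden_layer_sizes=(128, 64, 32), activation="relu",
                    learning_rate_init=1e-4, max_iter=2000, random_state=0)

for name, clf in [("k-NN", knn), ("SVM", svm), ("ANN", ann)]:
    clf.fit(X_train, y_train)
    print(name, "test accuracy:", clf.score(X_test, y_test))
```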
In particular, two types of machine-learning algorithms that consider sex differences in muscles were proposed. The first approach was to differentiate the datasets by sex: the iEMG values of the sEMG signals from the 10 males and 10 females were divided into two separate datasets for model training and prediction, while the iEMG values of 10 randomly selected subjects (5 males and 5 females) served as the control dataset. The second approach was to add a sex label to the total dataset as an additional feature for classification: the iEMG values from all 10 males and 10 females formed one overall dataset, with the sex-labeled version as the experimental group and the unlabeled version as the control group. The dataset names and descriptions are given in Table 2. The optimal configurations of the k-NN, SVM and ANN classifiers applied to the different datasets, identified from the training set, are shown in Table 3 and Table 4.
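The two dataset constructions can be expressed compactly as below; the array names (X, y, sex, subject), the synthetic data and the 0/1 coding of the sex label are our assumptions.

```python
import numpy as np

# Hypothetical arrays: X (n_events x 6 iEMG features), y (gesture labels),
# sex (n_events,) with 'F'/'M', subject (n_events,) subject IDs; synthetic here
rng = np.random.default_rng(0)
n = 2400
X = rng.gamma(2.0, 500.0, size=(n, 6))
y = rng.integers(1, 7, size=n)
subject = rng.integers(0, 20, size=n)
sex = np.where(subject < 10, "F", "M")          # first 10 subjects female, rest male

# Method 1: differentiate the datasets by sex (Female, Male, Half_F&M)
X_f, y_f = X[sex == "F"], y[sex == "F"]
X_m, y_m = X[sex == "M"], y[sex == "M"]
half_ids = np.concatenate([
    rng.choice(np.unique(subject[sex == "F"]), 5, replace=False),
    rng.choice(np.unique(subject[sex == "M"]), 5, replace=False)])
half = np.isin(subject, half_ids)               # control set: 5 females + 5 males
X_half, y_half = X[half], y[half]

# Method 2: append the sex label as an additional feature (Sum vs. Sum_Label)
sex_code = (sex == "M").astype(float).reshape(-1, 1)  # 0 = female, 1 = male (our coding)
X_label = np.hstack([X, sex_code])
```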
In addition, cross validation (CV) was used to ensure the validity of the results. CV divides a dataset into training and test subsets for supervised learning: k-fold CV first partitions the dataset into k subsets of equal size, with members assigned to subsets at random; each subset is then used in turn as the test set while the remaining subsets form the training set, and the resulting prediction accuracy is the average of the k individual accuracies. In this study, 30% of the dataset was used as the validation set for parameter tuning, and the remaining 70% was used for 5-fold CV to assess prediction performance. The optimal configurations of the k-NN, SVM and ANN classifiers applied to the different datasets, determined from the validation set, are shown in Table 5 and Table 6. Meanwhile, a t-test was used to determine whether considering sex differences significantly improved the prediction accuracy of the different classifiers.
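A sketch of the 5-fold CV comparison with a t-test on the per-fold accuracies, reusing the arrays from the previous sketch; this mirrors the comparison in Figure 6 under our assumptions about the exact protocol.

```python
from scipy.stats import ttest_ind
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# X, y and X_label as constructed in the previous sketch
ann = MLPClassifier(hidden_layer_sizes=(128, 64, 32), activation="relu",
                    learning_rate_init=1e-4, max_iter=2000, random_state=0)

acc_sum = cross_val_score(ann, X, y, cv=5)        # Sum: without the sex label
acc_lab = cross_val_score(ann, X_label, y, cv=5)  # Sum_Label: with the sex label

t, p = ttest_ind(acc_lab, acc_sum)                # compare the per-fold accuracies
print(f"Sum_Label {acc_lab.mean():.3f} vs Sum {acc_sum.mean():.3f}, p = {p:.4f}")
```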

3. Results

3.1. Statistical Analysis of iEMG

Figure 2 shows the results of the statistical analysis of the iEMG values for the six hand gestures for the same muscle pairs in the right hands of females and males. Most of the same muscle pairs showed significant differences between females and males when performing the six hand movements.
In three hand movements (RM, FF and FK), the iEMG values of all six muscles differed significantly between females and males. In the remaining three movements (TB, TI and HC), the iEMG values of more than four muscles differed significantly. Furthermore, in all six hand movements, iEMG values were lower in female muscles than in male muscles (p < 0.05), except for APB in TB and FK. These results demonstrate a significant difference between male and female right upper limb muscles in performing the six gestures.
Table 7 shows the results of the three-way ANOVA on iEMG values for the sex, motion and muscle factors, which are denoted SEX, MT and MUSCLE, respectively, in Table 7. The three-way ANOVA was run on the sample of iEMG values to examine the effects of sex, motion and muscle. There was a statistically significant three-way interaction between SEX, MT and MUSCLE (F = 15.504, p = 0.000). The two-way interactions MT × SEX (F = 13.192, p = 0.000), MT × MUSCLE (F = 594.537, p = 0.000) and SEX × MUSCLE (F = 50.537, p = 0.000) were also statistically significant, as were the main effects of MT (F = 1710.542, p = 0.000), SEX (F = 412.699, p = 0.000) and MUSCLE (F = 3374.416, p = 0.000). The results of Tukey's HSD multiple comparison test are shown in Appendix A (Table A1 and Table A2); they indicate statistically significant differences between all groups except between MT-5 and MT-6 (a mean difference marked with * indicates a significant difference between groups).

3.2. sEMG Signal-Based Movement Classification

Based on the sEMG signal events, iEMG features were calculated for the six channels. Eighty percent of the events were used to train the classifiers (kNN, SVM and ANN), that is, to find the optimal parameters that minimize the error between the estimated and true label of each event, as shown in Table 3 and Table 4. The remaining 20% of the events were used to test the classifiers and evaluate their performance. In turn, it was verified whether the two machine-learning algorithms that consider sex differences in muscle can improve prediction accuracy.
Test-set prediction confusion matrices (7 × 7) were built separately for each classifier for the method of differentiating the datasets by sex and for the method of adding a sex label, as shown in Figure 3 and Figure 4. The confusion matrices contain the percentage of events correctly and incorrectly classified per hand gesture, the prediction accuracy and coverage per hand gesture and the total prediction accuracy of the classifier. As shown in Figure 3, for all three classifiers (kNN, SVM and ANN), the sex-differentiated datasets (Female, Male) gave significantly better prediction results than the non-differentiated dataset (Half_F&M). The ANN classifier had the best prediction results, with 98.3% for Female, 97.0% for Male and 96.5% for Half_F&M. As illustrated in Figure 4, for all three classifiers, the prediction results with the sex label (Sum_Label) were significantly better than those without it (Sum); the best results were again obtained with the ANN classifier, at 98.4% for Sum_Label and 97.9% for Sum. Figure 5 shows the overall prediction accuracies obtained by aggregating the confusion matrices of the test sets in Figure 3 and Figure 4. For the kNN and SVM classifiers, the prediction accuracy of the sex-differentiated datasets was higher than that of adding a sex label, whereas for the ANN classifier, adding a sex label gave higher accuracy than differentiating the datasets. Overall, machine-learning algorithms that consider sex differences in muscle improve the prediction accuracy of hand gesture recognition, with the ANN classifier with an added sex label achieving the highest prediction accuracy of 98.4%.
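The per-gesture precision and recall reported in these matrices can be reproduced with scikit-learn, assuming a fitted classifier clf and a held-out split as in the earlier sketches.

```python
from sklearn.metrics import confusion_matrix, classification_report

# clf, X_test, y_test: a fitted classifier and held-out split from the earlier sketch
y_pred = clf.predict(X_test)
print(confusion_matrix(y_test, y_pred))                  # rows: true, columns: predicted
print(classification_report(y_test, y_pred, digits=3))   # per-gesture precision/recall
```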
In addition, as indicated by the histograms of prediction accuracy for the two types of machine-learning algorithms in Figure 5, prediction accuracy increased once sex was taken into account. For the kNN classifier, the sex-differentiated datasets gave higher prediction accuracy (95.5% for females and 95.6% for males) than the sex-labeling method (94%). For the SVM classifier, the two methods were similar, with the sex-differentiated datasets slightly higher (96.3% for females and 93.9% for males) than the sex-labeling method (94.2%). For the ANN classifier, the sex-labeling method gave higher prediction accuracy (98.4%) than the sex-differentiated datasets (95.5% for females and 95.6% for males).
Furthermore, the results of the 5-fold CV and t-tests are shown in Figure 6. For the method of differentiating the datasets by sex, the prediction accuracy on the Female dataset was significantly higher than on the Half_F&M dataset for all of the kNN, SVM and ANN classifiers, and the accuracy on the Male dataset was significantly higher than on Half_F&M for the SVM and ANN classifiers. For the method of adding the sex label, the prediction accuracy of the kNN, SVM and ANN classifiers with the sex label was significantly higher than without it. For the kNN classifier, the Female dataset was significantly more accurate than Half_F&M, while the mean accuracy on the Male dataset was only slightly higher than on Half_F&M; thus, for kNN, differentiating the datasets by sex significantly improves prediction accuracy for females, with a smaller improvement for males. It can therefore be concluded that differentiating the datasets by sex improves prediction accuracy for kNN classification. Overall, machine-learning algorithms that consider sex differences in muscle significantly improved the prediction accuracy of gesture recognition, with the ANN classifier achieving higher prediction accuracy than the kNN and SVM classifiers.

4. Discussion

Hand gesture recognition based on sEMG signals that takes sex differences into account is a promising pattern recognition approach. In this study, it was shown that hand gesture recognition algorithms that consider sex differences yield better classification predictions than traditional sEMG-based algorithms. These results have important implications for the development of robust machine-learning algorithms for hand gesture recognition and thus for clinical applications in prosthetic hands and exoskeletons.

4.1. Pattern Recognition and Sex Differences

In this study, we focused on accuracy and classification timeliness in assessing the performance of a myoelectric pattern recognition control system. In total, four time-domain features, widely used in the literature for motion classification, were extracted from the EMG signals of six muscles for 2400 events. Considering the transmission delay and hysteresis prevalent in myoelectric control systems, the four time-domain features were filtered by dimensionality reduction based on correlation analysis, and only the iEMG values were selected for pattern classification to reduce the computation time of the feature extraction phase and the complexity of the classification model. Previous studies have reported hand movement recognition using combinations of multiple time-domain features [30] and multiple time–frequency-domain features [41], with a high number of channels and considerable redundancy in the EMG data; the average prediction accuracies they obtained were only 90.3% and 97.23%, respectively. Furthermore, previous studies have demonstrated sex-related differences in kinematics [34] and EMG [35] in dynamic manual manipulation tasks of the upper extremity. However, few studies have applied sex factors to gesture pattern recognition based on EMG signals.
In this paper, considering sex differences in sEMG signals, statistical analysis (one-way ANOVA) showed that the right upper limb muscles of males and females differ significantly when performing the six gestures. Therefore, two types of machine-learning algorithms that consider muscle sex differences were proposed: differentiating the datasets by sex and adding a sex label. In particular, only the six-channel iEMG features of each event were used for hand gesture recognition. An average overall prediction accuracy of 98.4% was obtained for the ANN classifier with the added sex label. High prediction accuracy was thus achieved with fewer channels, fewer features and consideration of muscle sex differences. In future studies, it may be interesting to incorporate the EMG sex difference factor into EMG control system design to achieve categorical control of prosthetic and robotic hands.

4.2. Comparison of Classifiers

Although previous studies have proposed various classifiers for hand gesture pattern recognition (the LDA classifier [42], the k-NN classifier [27], the SVM classifier [32] and the ANN classifier [33]), few comparative studies have been conducted. In this study, three classification algorithms (k-NN, SVM and ANN) were applied separately to hand movement recognition. As shown in Figure 5B, for the overall dataset with or without sex labeling, the classification effectiveness in descending order is ANN, SVM, kNN. However, for half of the overall dataset, as shown in Figure 5A, the overall classification effectiveness in descending order is ANN, kNN, SVM. This illustrates the claim that there is no universally optimal classifier, only the most suitable one. Taken together, the ANN classifier achieved the best classification prediction accuracy compared with the kNN and SVM classifiers: the neural network outperformed the classical machine-learning classifiers.

4.3. Algorithm Improvements

In this study, the method of differentiating the dataset by sex halved the sample size relative to the overall sample. The greatly reduced sample size was detrimental to the training of the SVM and ANN algorithms, as indicated in Figure 5. A possible solution would be to recruit a further 20 subjects (10 female, 10 male) for an identical set of experiments, expanding the original database.
The other method of hand movement recognition that considers muscle sex differences was to add a sex label to the overall sample. Compared with the overall dataset without a sex label (Sum), the prediction accuracy of the overall dataset with a sex label (Sum_Label) was significantly improved. Notably, the weighting of the sex label also has some effect on the prediction results; it may therefore be interesting to adjust the weight of the sex label and expand the database in subsequent work.
In summary, the method of adding a sex label preserves all the sample data, without the loss of data volume incurred by differentiating the dataset by sex. Moreover, the optimal overall prediction result with the added sex label was 98.4% for the ANN classifier, while the optimal result with the sex-specific datasets was also obtained with the ANN classifier (female: 98.3%; male: 97.0%). Considering both the utilization of the sample and the overall prediction accuracy, the ANN hand movement recognition algorithm with an added sex label was found to be optimal.

4.4. Cross Validation

In addition, 5-fold cross validation was used in this study as a complementary way to verify whether considering sex differences improves prediction performance (see Figure 6). For k-fold cross validation, both the average prediction accuracy and its standard deviation serve as evaluation indicators. Moreover, t-tests were used to further demonstrate that considering sex differences can significantly improve the prediction accuracy of hand gesture recognition.

4.5. Limitations and Future Work

Although very good outcomes were obtained in this study, a number of limitations should be addressed in future work. First, only time-domain features were considered for movement pattern recognition; frequency-domain features [41] can also be used for the pattern recognition of hand movements, whereas in this paper only time-domain features were extracted, reduced in dimensionality and screened. It would be interesting to examine in future studies whether adding frequency-domain features improves prediction accuracy. Another limitation is that only offline learning algorithms were built. In future work, real-time predictive learning [43,44] may be more relevant for prosthetic hand and exoskeleton control.
Overall, the main contribution of this study is showing that considering sex differences in muscles enhances the performance of hand movement pattern recognition systems, compared with current recognition systems that use sEMG signals as input, and could thus have a real clinical and commercial impact.

5. Conclusions

In this study, machine-learning algorithms were used for hand movement pattern recognition, and the effect of adding the sex factor on recognition performance was examined. This research is important because it can be applied to robot-assisted rehabilitation and dexterous prosthetic hand control systems that drive hand exoskeletons and prosthetic hands with sEMG signals. First, this paper presents a statistical analysis of iEMG values calculated from sEMG signals recorded from six muscles of 20 healthy subjects; the results indicate differences between the right-hand sEMG signals of females and males. Based on these differences, two types of machine-learning algorithms for hand gesture recognition that consider sex differences (differentiating the datasets by sex and adding a sex label) were proposed. These two methods were applied to the kNN, SVM and ANN classifiers, demonstrating that taking sex differences into account improves the accuracy of hand movement recognition. Among these algorithms, the ANN classifier with an added sex label had the best classification performance, with an average prediction accuracy of 98.4%. However, while our sample size was sufficient to demonstrate the performance of the method, a larger database is necessary for clinical application, and expanding the database is a must to ensure the robustness and stability of the system. In the future, our myoelectric control system will be applied to areas such as robotic exoskeletons and human prostheses to achieve better rehabilitation outcomes.

Author Contributions

Conceptualization, R.Z.; data curation, R.Z., D.H. and R.W.; formal analysis, R.Z.; funding acquisition, X.Z. and Y.G.; investigation, R.Z.; methodology, R.Z.; project administration, R.Z.; resources, R.Z., X.Z. and Y.G.; software, R.Z.; supervision, R.Z.; validation, R.Z.; visualization, R.Z.; writing—original draft, R.Z.; writing—review and editing, R.Z., X.Z. and Y.G. All authors have read and agreed to the published version of the manuscript.

Funding

The work was supported by the National Natural Science Foundation of China under grant number 11972243.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee for Biomedical Investigations of Taiyuan University of Technology (No. TYUT202111001, 1 November 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

Thanks to One Measurement Group Ltd. (http://www.omgl.com.cn, accessed on 25 October 2021) for providing the EMG acquisition equipment.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. iEMG value and Tukey HSD for MT.

(I) MT | (J) MT | Mean Difference (I–J) | Std. Error | Sig.
MT-1 | MT-2 | −345.451 * | 25.0213 | 0.000
MT-1 | MT-3 | −836.644 * | 24.4142 | 0.000
MT-1 | MT-4 | −739.289 * | 23.8941 | 0.000
MT-1 | MT-5 | −1758.426 * | 23.8241 | 0.000
MT-1 | MT-6 | −1788.877 * | 24.5963 | 0.000
MT-2 | MT-1 | 345.451 * | 25.0213 | 0.000
MT-2 | MT-3 | −491.193 * | 25.4514 | 0.000
MT-2 | MT-4 | −393.838 * | 24.9529 | 0.000
MT-2 | MT-5 | −1412.975 * | 24.8859 | 0.000
MT-2 | MT-6 | −1443.426 * | 25.6261 | 0.000
MT-3 | MT-1 | 836.644 * | 24.4142 | 0.000
MT-3 | MT-2 | 491.193 * | 25.4514 | 0.000
MT-3 | MT-4 | 97.355 * | 24.3441 | 0.001
MT-3 | MT-5 | −921.782 * | 24.2755 | 0.000
MT-3 | MT-6 | −952.233 * | 25.0337 | 0.000
MT-4 | MT-1 | 739.289 * | 23.8941 | 0.000
MT-4 | MT-2 | 393.838 * | 24.9529 | 0.000
MT-4 | MT-3 | −97.355 * | 24.3441 | 0.001
MT-4 | MT-5 | −1019.137 * | 23.7523 | 0.000
MT-4 | MT-6 | −1049.588 * | 24.5267 | 0.000
MT-5 | MT-1 | 1758.426 * | 23.8241 | 0.000
MT-5 | MT-2 | 1412.975 * | 24.8859 | 0.000
MT-5 | MT-3 | 921.782 * | 24.2755 | 0.000
MT-5 | MT-4 | 1019.137 * | 23.7523 | 0.000
MT-5 | MT-6 | −30.451 | 24.4586 | 0.815
MT-6 | MT-1 | 1788.877 * | 24.5963 | 0.000
MT-6 | MT-2 | 1443.426 * | 25.6261 | 0.000
MT-6 | MT-3 | 952.233 * | 25.0337 | 0.000
MT-6 | MT-4 | 1049.588 * | 24.5267 | 0.000
MT-6 | MT-5 | 30.451 | 24.4586 | 0.815
(*) represents a significant level of 0.05 for the Mean Difference (I–J).
Table A2. iEMG value and Tukey HSD for MUSCLE.

(I) MUSCLE | (J) MUSCLE | Mean Difference (I–J) | Std. Error | Sig.
MUSCLE-1 | MUSCLE-2 | 2557.925 * | 24.5518 | 0.000
MUSCLE-1 | MUSCLE-3 | 2649.010 * | 24.5518 | 0.000
MUSCLE-1 | MUSCLE-4 | 2483.068 * | 24.5518 | 0.000
MUSCLE-1 | MUSCLE-5 | 2101.673 * | 24.5518 | 0.000
MUSCLE-1 | MUSCLE-6 | 1748.471 * | 24.5518 | 0.000
MUSCLE-2 | MUSCLE-1 | −2557.925 * | 24.5518 | 0.000
MUSCLE-2 | MUSCLE-3 | 91.085 * | 24.5518 | 0.003
MUSCLE-2 | MUSCLE-4 | −74.857 * | 24.5518 | 0.028
MUSCLE-2 | MUSCLE-5 | −456.252 * | 24.5518 | 0.000
MUSCLE-2 | MUSCLE-6 | −809.455 * | 24.5518 | 0.000
MUSCLE-3 | MUSCLE-1 | −2649.010 * | 24.5518 | 0.000
MUSCLE-3 | MUSCLE-2 | −91.085 * | 24.5518 | 0.003
MUSCLE-3 | MUSCLE-4 | −165.942 * | 24.5518 | 0.000
MUSCLE-3 | MUSCLE-5 | −547.337 * | 24.5518 | 0.000
MUSCLE-3 | MUSCLE-6 | −900.540 * | 24.5518 | 0.000
MUSCLE-4 | MUSCLE-1 | −2483.068 * | 24.5518 | 0.000
MUSCLE-4 | MUSCLE-2 | 74.857 * | 24.5518 | 0.028
MUSCLE-4 | MUSCLE-3 | 165.942 * | 24.5518 | 0.000
MUSCLE-4 | MUSCLE-5 | −381.395 * | 24.5518 | 0.000
MUSCLE-4 | MUSCLE-6 | −734.598 * | 24.5518 | 0.000
MUSCLE-5 | MUSCLE-1 | −2101.673 * | 24.5518 | 0.000
MUSCLE-5 | MUSCLE-2 | 456.252 * | 24.5518 | 0.000
MUSCLE-5 | MUSCLE-3 | 547.337 * | 24.5518 | 0.000
MUSCLE-5 | MUSCLE-4 | 381.395 * | 24.5518 | 0.000
MUSCLE-5 | MUSCLE-6 | −353.202 * | 24.5518 | 0.000
MUSCLE-6 | MUSCLE-1 | −1748.471 * | 24.5518 | 0.000
MUSCLE-6 | MUSCLE-2 | 809.455 * | 24.5518 | 0.000
MUSCLE-6 | MUSCLE-3 | 900.540 * | 24.5518 | 0.000
MUSCLE-6 | MUSCLE-4 | 734.598 * | 24.5518 | 0.000
MUSCLE-6 | MUSCLE-5 | 353.202 * | 24.5518 | 0.000
(*) represents a significant level of 0.05 for the Mean Difference (I–J).

References

  1. Adewuyi, A.A.; Hargrove, L.J.; Kuiken, T.A. An Analysis of Intrinsic and Extrinsic Hand Muscle EMG for Improved Pattern Recognition Control. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 485–494. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Dillingham, T.R.; Pezzin, L.E.; MacKenzie, E.J. Limb amputation and limb deficiency: Epidemiology and recent trends in the United States. South. Med. J. 2002, 95, 875–883. [Google Scholar] [CrossRef] [PubMed]
  3. Ziegler-Graham, K.; MacKenzie, E.J.; Ephraim, P.L.; Travison, T.G.; Brookmeyer, R. Estimating the prevalence of limb loss in the United States: 2005 to 2050. Arch. Phys. Med. Rehabil. 2008, 89, 422–429. [Google Scholar] [CrossRef] [PubMed]
  4. Waryck, B. Comparison Of Two Myoelectric Multi-Articulating Prosthetic Hands. Myoelectric Symp. 2011. [Google Scholar]
  5. Kim, J.; Thayabaranathan, T.; Donnan, G.A.; Howard, G.; Howard, V.J.; Rothwell, P.M.; Feigin, V.; Norrving, B.; Owolabi, M.; Pandian, J.; et al. Global Stroke Statistics 2019. Int. J. Stroke 2020, 15, 819–838. [Google Scholar]
  6. Benjamin, E.J.; Virani, S.S.; Callaway, C.W.; Chamberlain, A.M.; Chang, A.R.; Cheng, S.; Chiuve, S.E.; Cushman, M.; Delling, F.N.; Deo, R.; et al. Heart Disease and Stroke Statistics-2018 Update A Report From the American Heart Association. Circulation 2018, 137, E67–E492. [Google Scholar] [CrossRef] [PubMed]
  7. Lawrence, E.S.; Coshall, C.; Dundas, R.; Stewart, J.; Rudd, A.G.; Howard, R.; Wolfe, C.D. Estimates of the prevalence of acute stroke impairments and disability in a multiethnic population. Stroke 2001, 32, 1279–1284. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Hidler, J.M.; Wall, A.E. Alterations in muscle activation patterns during robotic-assisted walking. Clin. Biomech. 2005, 20, 184–193. [Google Scholar] [CrossRef]
  9. Kwakkel, G.; Kollen, B.J.; Krebs, H.I. Effects of robot-assisted therapy on upper limb recovery after stroke: A systematic review. Neurorehabilit. Neural Repair 2008, 22, 111–121. [Google Scholar] [CrossRef]
  10. Lambercy, O.; Lünenburger, L.; Gassert, R. Robots for Measurement/Clinical Assessment. In Neurorehabilitation Technology; Springer: London, UK, 2012. [Google Scholar]
  11. Chang, W.H.; Yun-Hee, K. Robot-assisted Therapy in Stroke Rehabilitation. J. Stroke 2013, 15, 174–181. [Google Scholar] [CrossRef]
  12. Lambercy, O.; Ranzani, R.; Gassert, R. Robot-assisted rehabilitation of hand function. Rehabil. Robot. 2018, 23, 205–225. [Google Scholar]
  13. Turolla, A. An overall framework for neurorehabilitation robotics: Implications for recovery. Rehabil. Robot. 2018, 9, 15–27. [Google Scholar]
  14. Takahashi, C.D.; Der-Yeghiaian, L.; Le, V.H.; Cramer, S.C. A robotic device for hand motor therapy after stroke. In Proceedings of the 9th International Conference on Rehabilitation Robotics, 2005. ICORR 2005, Chicago, IL, USA, 28 June–1 July 2005. [Google Scholar]
  15. Lambercy, O.; Dovat, L.; Gassert, R.; Burdet, E.; Teo, C.L.; Milner, T. A Haptic Knob for Rehabilitation of Hand Function. IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 356–366. [Google Scholar] [CrossRef] [PubMed]
  16. Ho, N.S.K.; Tong, K.Y.; Hu, X.L.; Fung, K.L.; Wei, X.J.; Rong, W.; Susanto, E.A. An EMG-driven exoskeleton hand robotic training device on chronic stroke subjects: Task training system for stroke rehabilitation. In Proceedings of the IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011. [Google Scholar]
  17. Schabowsky, C.N.; Godfrey, S.B.; Holley, R.J.; Lum, P.S. Development and pilot testing of HEXORR: Hand EXOskeleton Rehabilitation Robot. J. NeuroEngineering Rehabil. 2010, 7, 1–16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Polygerinos, P.; Galloway, K.C.; Sanan, S.; Herman, M.; Walsh, C.J. EMG controlled soft robotic glove for assistance during activities of daily living. In Proceedings of the IEEE International Conference on Rehabilitation Robotics, Singapore, 11–14 August 2015. [Google Scholar]
  19. Borboni, A.; Mor, M.; Faglia, R. Gloreha-Hand Robotic Rehabilitation: Design, Mechanical Model, and Experiments. J. Dyn. Syst. Meas. Control. 2016, 138, 55–60. [Google Scholar] [CrossRef]
  20. Wang, D.J.; Meng, Q.Y.; Meng, Q.L.; Li, X.W.; Yu, H.L. Design and Development of a Portable Exoskeleton for Hand Rehabilitation. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 2376–2386. [Google Scholar] [CrossRef] [PubMed]
  21. Arteaga, M.V.; Castiblanco, J.C.; Mondragon, I.F.; Colorado, J.D.; Alvarado-Rojas, C. EMG-driven hand model based on the classification of individual finger movements. Biomed. Signal Processing Control. 2020, 58, 101834. [Google Scholar] [CrossRef]
  22. Lenzi, T.; De Rossi, S.M.M.; Vitiello, N.; Carrozza, M.C. Intention-Based EMG Control for Powered Exoskeletons. IEEE Trans. Biomed. Eng. 2012, 59, 2180–2190. [Google Scholar] [CrossRef] [PubMed]
  23. Pulliam, C.L.; Lambrecht, J.M.; Kirsch, R.F. EMG-Based Neural Network Control of Transhumeral Prostheses. J. Rehabil. Res. Dev. 2011, 48, 739. [Google Scholar] [CrossRef]
  24. Fougner, A.; Stavdahl, O.; Kyberd, P.J.; Losier, Y.G.; Parker, P.A. Control of Upper Limb Prostheses: Terminology and Proportional Myoelectric Control—A Review. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 663–677. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Hayashi, H.; Furui, A.; Kurita, Y.; Tsuji, T. A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution. IEEE Trans. Biomed. Eng. 2017, 64, 2372–2381. [Google Scholar]
  26. Phinyomark, A.; Phukpattaranont, P.; Limsakul, C. Feature reduction and selection for EMG signal classification. Expert Syst. Appl. 2012, 39, 7420–7431. [Google Scholar] [CrossRef]
  27. Naik, G.R.; Al-Timemy, A.H.; Nguyen, H.T. Transradial Amputee Gesture Classification Using an Optimal Number of sEMG Sensors: An Approach Using ICA Clustering. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 24, 837–846. [Google Scholar] [CrossRef] [PubMed]
  28. Hudgins, B.; Parker, P.; Scott, R.N. A new strategy for multifunction myoelectric control. IEEE Trans. Bio-Med. Eng. 1993, 40, 82–94. [Google Scholar] [CrossRef] [PubMed]
  29. Du, Y.-C.; Lin, C.-H.; Shyu, L.-Y.; Chen, T. Portable hand motion classifier for multi-channel surface electromyography recognition using grey relational analysis. Expert Syst. Appl. 2010, 37, 4283–4291. [Google Scholar] [CrossRef]
  30. Robinson, C.P.; Li, B.; Meng, Q.; Pain, M.T.G. Pattern Classification of Hand Movements using Time Domain Features of Electromyography. In Proceedings of the 4th International Conference on Movement and Computing, New York, NY, USA, 14–17 February 2015. [Google Scholar]
  31. Al-Timemy, A.H.; Bugmann, G.; Escudero, J.; Outram, N. Classification of Finger Movements for the Dexterous Hand Prosthesis Control With Surface Electromyography. IEEE J. Biomed. Health Inform. 2013, 17, 608–618. [Google Scholar] [CrossRef] [PubMed]
  32. Oskoei, M.A.; Hu, H. Support vector machine-based classification scheme for myoelectric control applied to upper limb. IEEE Trans. Biomed. Eng. 2008, 55, 1956–1965. [Google Scholar] [CrossRef] [PubMed]
  33. Ariyanto, M.; Caesarendra, W.; Mustaqim, K.A.; Irfan, M.; Pakpahan, J.A.; Setiawan, J.D.; Winoto, A.R. Finger Movement Pattern Recognition Method Using Artificial Neural Network Based on Electromyography (EMG) Sensor. In Proceedings of the 2015 International Conference on Automation, Cognitive Science, Optics, Micro Electro-Mechanical System, and Information Technology, Bandung, Indonesia, 29–30 October 2015; pp. 12–17. [Google Scholar]
  34. Martinez, R.; Bouffard, J.; Michaud, B.; Plamondon, A.; Cote, J.N.; Begon, M. Sex differences in upper limb 3D joint contributions during a lifting task. Ergonomics 2019, 62, 682–693. [Google Scholar] [CrossRef] [PubMed]
  35. Bouffard, J.; Martinez, R.; Plamondon, A.; Cote, J.N.; Begon, M. Sex differences in glenohumeral muscle activation and coactivation during a box lifting task. Ergonomics 2019, 62, 1327–1338. [Google Scholar] [CrossRef]
  36. Hunter, K.S. Fatigability of the elbow flexor muscles for a sustained submaximal contraction is similar in men and women matched for strength. J. Appl. Physiol. 2004, 96, 195–202. [Google Scholar] [CrossRef] [Green Version]
  37. Kent-Braun, J.A.; Ng, A.V.; Doyle, J.W.; Towse, T.F. Human skeletal muscle responses vary with age and gender during fatigue due to incremental isometric exercise. J. Appl. Physiol. 2002, 93, 1813–1823. [Google Scholar] [CrossRef] [PubMed]
  38. Manjuanth, H.; Venkatesh, D.; Rajkumar, S.; Taklikar, R.H. Gender difference in hand grip strength and electromyogram (EMG) changes in upper limb. Res. J. Pharm. Biol. Chem. Sci. 2015, 6, 1889–1893. [Google Scholar]
  39. Kambayashi, I.; Ishimura, N.; Kobayashi, K.; Sagawa, M.; Takeda, H. Differences in Muscle Oxygenation Level According to Gender during Isometric Hand-grip Exercises. J. Hokkaido Univ. Educ. Nat. Sci. 2003, 54, 89–96. [Google Scholar]
  40. Zeng, H.B.; Li, K.; Tian, X.C.; Wei, N.; Song, R.; Zhou, L.L. Classification of hand motions using linear discriminant analysis and support vector machine. In Proceedings of the 2017 Chinese Automation Congress (CAC), Jinan, China, 20–22 October 2017. [Google Scholar]
  41. Du, S.J.; Vuskovic, M. Temporal vs. spectral approach to feature extraction from prehensile EMG signals. Proceedings of IEEE International Conference on Information Reuse and Integration, Las Vegas, NV, USA, 8–10 November 2021; pp. 344–350. [Google Scholar]
  42. Zhang, D.H.; Xiong, A.B.; Zhao, X.G.; Han, J.D. PCA and LDA for EMG-based Control of Bionic Mechanical hand. Proceedings of IEEE International Conference on Information and Automation (ICIA), Shenyang, China, 6–8 June 2012; pp. 960–965. [Google Scholar]
  43. Edwards, A.L.; Dawson, M.R.; Hebert, J.S.; Sherstan, C.; Sutton, R.S.; Chan, K.M.; Pilarski, P.M. Application of real-time machine learning to myoelectric prosthesis control: A case series in adaptive switching. Prosthet. Orthot. Int. 2016, 40, 573–581. [Google Scholar]
  44. Pilarski, P.M.; Dawson, M.R.; Degris, T.; Carey, J.P.; Sutton, R.S. Dynamic switching and real-time machine learning for improved human control of assistive biomedical robots. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012. [Google Scholar]
Figure 1. Experimental procedure for the acquisition of sEMG signals. (a) Acquisition settings: each person independently performs 6 hand gestures (RM, TB, TI, FF, HC and FK). The 6 hand gestures represent daily hand functions: using a mouse, pressing the thumb button, pinching objects with the thumb and index finger, closing the five fingers and grasping cylindrical or spherical objects. (b) Raw EMG signal of the RM hand gesture. (c) Time diagram of the experimental scheme.
Figure 2. The statistical analysis results of iEMG values for six hand movements of the same muscle pair in the right hand of female and male. (A) RM; (B) TB; (C) TI; (D) FF; (E) HC; (F) FK. (*) represents p < 0.05 between the right hand of the female and male. Error bars represent standard error.
Figure 3. Three classifiers (kNN, SVM and ANN) were applied to the confusion matrix (7 × 7) generated by the differentiating-the-sex dataset method for test-set hand gesture recognition. Precision represents the prediction accuracy of each gesture, and Recall represents the predicted coverage of each gesture: $\mathrm{Precision} = \frac{TP}{TP + FP} \times 100\%$, $\mathrm{Recall} = \frac{TP}{TP + FN} \times 100\%$, where TP = true positive, FP = false positive and FN = false negative.
Figure 4. Three classifiers (kNN, SVM and ANN) were applied to the confusion matrix (7 × 7) generated by the sex-labeling method for test-set hand gesture recognition. Precision represents the prediction accuracy of each gesture, and Recall represents the predicted coverage of each gesture: $\mathrm{Precision} = \frac{TP}{TP + FP} \times 100\%$, $\mathrm{Recall} = \frac{TP}{TP + FN} \times 100\%$, where TP = true positive, FP = false positive and FN = false negative.
Figure 5. Total prediction accuracy statistics of three classifiers (kNN, SVM and ANN) applied to a test set for hand gesture recognition for two methods considering sex differences: (A) differentiating-the-sex dataset and (B) adding a sex label.
Figure 6. Results of statistical analysis of the total prediction accuracy of three classifiers (kNN, SVM and ANN) with 5-fold cross validation, applied to a hand gesture recognition test set for two methods considering sex differences. (A) Differentiation of sex datasets. (B) Addition of a sex label. (*) represents p < 0.05 between the total prediction accuracy of the two datasets. Error bars represent standard errors.
Table 1. Time features extracted from the EMG signal.

Domain | Feature | Formulation
Time | Integral Electromyographic (iEMG) | $\int_{t}^{t+T} \lvert EMG(t) \rvert \, dt$
Time | Mean Absolute Value (MAV) | $\frac{1}{N}\sum_{i=1}^{N} \lvert x_i \rvert$
Time | Input Contribution Rate (ICRi) | $\int_{t}^{t+T} \lvert EMG(t)_i \rvert \, dt \Big/ \sum_{i=1}^{N} \int_{t}^{t+T} \lvert EMG(t)_i \rvert \, dt$
Time | Variance (VAR) | $\frac{1}{N-1}\sum_{i=1}^{N} x_i^2$
Table 2. The dataset descriptions.

Method | Dataset Name | Dataset Description
Differentiating-the-sex dataset | Female | iEMG values for 10 females
Differentiating-the-sex dataset | Male | iEMG values for 10 males
Differentiating-the-sex dataset | Half_F&M | iEMG values for 5 males and 5 females
Adding a sex labeling | Sum | iEMG values for 10 males and 10 females without sex label
Adding a sex labeling | Sum_Label | iEMG values for 10 males and 10 females with sex label
Table 3. The optimal configurations of k-NN, SVM and ANN classifiers for differentiating sex datasets obtained from the training set.

Classifier | Dataset | Neighbors | Kernel | Coefficient | Neurons Layer 1 | Neurons Layer 2 | Neurons Layer 3 | Activation Function | Learning Rate (α)
KNN | Female | 2 | - | - | - | - | - | - | -
KNN | Male | 4 | - | - | - | - | - | - | -
KNN | Half_F&M | 3 | - | - | - | - | - | - | -
SVM | Female | - | Gaussian | 9520 | - | - | - | - | -
SVM | Male | - | Gaussian | 600 | - | - | - | - | -
SVM | Half_F&M | - | Gaussian | 9520 | - | - | - | - | -
ANN | Female | - | - | - | 128 | 32 | - | ReLU | 10^−4
ANN | Male | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
ANN | Half_F&M | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
Table 4. The optimal configurations of k-NN, SVM and ANN classifiers for adding a sex labeling obtained from the training set.

Classifier | Dataset | Neighbors | Kernel | Coefficient | Neurons Layer 1 | Neurons Layer 2 | Neurons Layer 3 | Activation Function | Learning Rate (α)
KNN | Sum | 4 | - | - | - | - | - | - | -
KNN | Sum_Label | 3 | - | - | - | - | - | - | -
SVM | Sum | - | Gaussian | 17,000 | - | - | - | - | -
SVM | Sum_Label | - | Gaussian | 3000 | - | - | - | - | -
ANN | Sum | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
ANN | Sum_Label | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
Table 5. The optimal configurations of k-NN, SVM and ANN classifiers for differentiating sex datasets obtained from the validation set.

Classifier | Dataset | Neighbors | Kernel | Coefficient | Neurons Layer 1 | Neurons Layer 2 | Neurons Layer 3 | Activation Function | Learning Rate (α)
KNN | Female | 2 | - | - | - | - | - | - | -
KNN | Male | 2 | - | - | - | - | - | - | -
KNN | Half_F&M | 5 | - | - | - | - | - | - | -
SVM | Female | - | Gaussian | 610 | - | - | - | - | -
SVM | Male | - | Gaussian | 3850 | - | - | - | - | -
SVM | Half_F&M | - | Gaussian | 17,000 | - | - | - | - | -
ANN | Female | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
ANN | Male | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
ANN | Half_F&M | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
Table 6. The optimal configurations of k-NN, SVM and ANN classifiers for adding a sex labeling obtained from the validation set.

Classifier | Dataset | Neighbors | Kernel | Coefficient | Neurons Layer 1 | Neurons Layer 2 | Neurons Layer 3 | Activation Function | Learning Rate (α)
KNN | Sum | 6 | - | - | - | - | - | - | -
KNN | Sum_Label | 4 | - | - | - | - | - | - | -
SVM | Sum | - | Gaussian | 1000 | - | - | - | - | -
SVM | Sum_Label | - | Gaussian | 590 | - | - | - | - | -
ANN | Sum | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
ANN | Sum_Label | - | - | - | 128 | 64 | 32 | ReLU | 10^−4
Table 7. Tests of between-subject effects (dependent variable: iEMG value).

Source | Type III Sum of Squares | df | Mean Square | F | Sig.
Corrected Model a | 35,886,260,844.381 | 71 | 505,440,293.583 | 589.041 | 0.000
Intercept | 28,887,146,915.346 | 1 | 28,887,146,915.346 | 33,665.138 | 0.000
MT | 7,338,848,044.619 | 5 | 1,467,769,608.924 | 1710.542 | 0.000
SEX | 354,125,990.425 | 1 | 354,125,990.425 | 412.699 | 0.000
MUSCLE | 14,477,477,144.689 | 5 | 2,895,495,428.938 | 3374.416 | 0.000
MT × SEX | 56,599,280.705 | 5 | 11,319,856.141 | 13.192 | 0.000
MT × MUSCLE | 12,753,904,623.649 | 25 | 510,156,184.946 | 594.537 | 0.000
SEX × MUSCLE | 216,823,324.470 | 5 | 43,364,664.894 | 50.537 | 0.000
MT × SEX × MUSCLE | 332,584,288.419 | 25 | 13,303,371.537 | 15.504 | 0.000
Error | 14,595,822,050.544 | 17,010 | 858,073.019 | |
a R Squared = 0.711 (Adjusted R Squared = 0.710).