Article

Improved Gender Recognition during Stepping Activity for Rehab Application Using the Combinatorial Fusion Approach of EMG and HRV

1 Malaysia-Japan International Institute of Technology, Universiti Teknologi Malaysia, Jalan Sultan Yahya Petra, Kuala Lumpur 54100, Malaysia
2 Department of Biotechnology and Medical Engineering, Faculty of Biosciences and Medical Engineering, Universiti Teknologi Malaysia, Skudai 81310, Malaysia
3 Department of Bioscience and Engineering, College of System Engineering, Shibaura Institute of Technology, Fukasaku 307, Saitama 337-8570, Japan
* Author to whom correspondence should be addressed.
Appl. Sci. 2017, 7(4), 348; https://doi.org/10.3390/app7040348
Submission received: 18 January 2017 / Revised: 27 March 2017 / Accepted: 27 March 2017 / Published: 31 March 2017

Abstract

Gender recognition is trivial for a physiotherapist but remains a challenge for computers. In this work, electromyography (EMG) and heart rate variability (HRV) were utilized for gender recognition during exercise on a stepper. Relevant features were extracted, selected, and then fused to predict gender automatically. However, selecting features that ensure good classification accuracy is itself a challenge. Thus, this paper employs a feature selection approach based on both the performance of and the diversity between feature sets, measured via the rank-score characteristic (RSC) function in a combinatorial fusion approach (CFA) (Hsu et al.). The features from the selected feature sets were then fused using the CFA. The results were compared with other fusion techniques, namely Naïve Bayes (NB), Decision Tree (J48), k-Nearest Neighbor (KNN) and Support Vector Machine (SVM), as well as with previous research on gender recognition. The experimental results showed that the CFA is efficient and effective for feature selection, and that the fusion method improves the gender recognition rate. The CFA yields a much better gender classification result (94.51%) than Barani's work (90.34%), Nazarloo's work (92.50%), and the other classifiers.

1. Introduction

Rehabilitation is a physiotherapy activity to partially or fully restore the motor abilities of a patient. Usually, physiological devices (PDs) such as steppers, upright bicycles and treadmills are used to assist in the rehabilitation process. The PD used in this study is a modified, rather than conventional, stepper. It was built using a smart material known as magnetorheological (MR) fluid, installed as an MR valve on the stepper. The conventional stepper machine was slightly modified so that it can be controlled through the MR valve (see Figure 1) for rehab applications. Mechanical properties of the MR-based damper, such as stiffness, viscosity and hardness, change with the induced magnetic flux. The modified stepper is thus a current-controlled device that varies the stiffness level (i.e., low, medium, high) of the damper. The characteristics of the MR fluid require users to exert effort in lower-limb exercise against resistance ranging from a low to a high stiffness level on the stepper. Before the current is applied to the MR-based stepper, however, the system interface should be able to identify the gender of the person using it. The main reason is that the strength levels of men and women differ [1,2,3]. Thus, if the same load is applied to both men and women during exercise, it may overexert some users, leaving them tired, unfit or in similar states.

1.1. The Need for Gender Recognition Using Biometrics

Accordingly, there is a need for a system that operates without human intervention, in this case from a physiotherapist, such as automatic gender recognition. Gender recognition is trivial for a physiotherapist or for people in general; however, it is very challenging for computers. Gender recognition can be very helpful for characterizing an individual user. Currently, computer vision systems play imperative roles in our lives through the development of visual surveillance and Human-Computer Interaction (HMI) innovations. Gender recognition, as part of the overall computer vision system, has received much attention in recent decades because such systems can be utilized in many areas, such as airport security [4] and surveillance and security systems [5]. It is also very helpful in rehabilitation, whereby the computer interface can directly recognize the gender of a person and consequently determine and set the input of the PDs, such as a current input associated with the strength level of men and women. This serves as visual surveillance or automated monitoring in the rehabilitation process.
In addition, biometric recognition provides rigorous security by recognizing an individual based on physiological characteristics [6]. Various biometric modalities have been explored previously, such as the iris, fingerprint, face and hand. Table 1 shows gender recognition based on the face in [7,8,9,10,11]. In [12,13,14,15], gender is recognized based on human gait; it can also be recognized based on footwear, color information and human speech [16,17,18,19]. Nevertheless, these biometrics are not sufficiently robust against counterfeiting. For example, the iris can be duplicated using contact lenses with copied iris features, a fingerprint can be imitated in latex, and a face can be falsified with an artificial mask. Moreover, these biometrics cannot provide adequate recognition accuracy.

1.2. Gender Recognition Using Physiological Signal

Meanwhile, other physiological biometrics that reflect neurological activity, such as electromyography (EMG), or autonomic behavior, such as heart rate variability (HRV), can also be used for gender recognition. Gender recognition using these newer physiological biometrics (e.g., EMG, ECG and EEG) is more complicated than using the face, iris or fingerprint, because ECG and EMG recordings can be changed or distorted by the physical status of the subject, movement, powerline interference and certain medications [20,21]. However, EMG and ECG biometric information is difficult to steal and nearly impossible to imitate, so it is well secured and confidential [22].
Much basic research on the electrocardiogram (ECG) has been actively conducted, and several studies have proposed the ECG as another biometric modality for identity recognition [23,24]. The fact that physiological differences of the heart across people produce unique ECG signals supports its use for biometric recognition [25]. Gender recognition indicated by HRV is among the most challenging issues in individual identification [26].
Furthermore, EMG provides digital information about the muscles. Although there have been a few analyses of EMG involving the lower limb [27], gender recognition during the stepping activity using EMG has yet to be investigated. Interestingly, there is a significant time-delay difference between genders during stepping [28], which indicates that there must be some relevant features in human muscles that help differentiate gender.
Multimodal physiological biometrics combine at least two physiological biometric features in order to overcome certain limitations of using them separately. There have been previous attempts to fuse multiple biometrics, such as combining the face, gait, fingerprint, palm print and geometry for gender recognition [29,30,31]. Table 2 shows more related work on the fusion of physiological biometrics and the gap addressed by this study.
Therefore, a bimodal physiological system for automatic gender recognition was developed in this study as part of a physiotherapy mechanism for online monitoring. Since a single-modality system cannot guarantee an accurate signal, fusing two or more signals is important for improving system performance [39,41,42,43,44,45,46,47,48]. To achieve robust and discriminative gender recognition, a fusion of EMG and HRV [40,41,49] is proposed before feeding into the interface system that controls the stiffness level of the stepper's MR valve for the rehabilitation exercise [40,41]. The two modalities, ECG (HRV) and EMG, were selected for combination because both are closely related when performing lower-limb exercise. Gender also appears to motivate people differently in performing regular rehab exercise. Reducing cost and enabling frequent monitoring are further objectives in improving the rehabilitation process based on gender physiological feedback control. Hence, an automated online monitoring system can be developed and a bimodal fusion system achieved.
The overview of this research is shown in Figure 2. Firstly, all the raw data are preprocessed for artifact correction. Next, all the features are extracted and the most relevant features are selected based on the performance of and the diversity between the two feature sets. Then, the Combinatorial Fusion Approach (CFA) (Hsu et al. [50,51,52]) is applied to fuse the EMG and HRV features for automatic gender recognition during stepping, which may help to separate male and female subjects before they undergo the rehabilitation process. Finally, the gender classification correct rate (%) based on feature fusion is obtained. The sections outside the bold solid line (i.e., the interface, control signal and physiological device sections) are outside the scope of this study.
The organization of this paper is as follows. Section 2 describes the research steps involved in automatic gender recognition during the stepping activity using the MR-based stepper in rehabilitation, including data acquisition and preprocessing of the EMG and ECG; the HRV is quantified from the ECG signals, and both EMG and HRV features are extracted. Section 3 presents feature selection using performance and cognitive diversity. The implementation of the CFA to fuse the features is described in Section 4, and the experimental results are analyzed and discussed in Section 4.2. Finally, Section 5 concludes and suggests future work.

2. Methodology

The proposed gender recognition for young subjects based on the EMG and HRV biometrics is illustrated in Figure 3. The proposed method is composed of a sequence of processing steps: preprocessing of the EMG and ECG signals, feature extraction, feature selection based on the performance of and the diversity between the EMG and HRV feature sets, and EMG-HRV feature fusion based on the Combinatorial Fusion Approach (CFA) (Hsu, Chung and Kristal [51] and Hsu, Kristal and Schweikert [52]). This paper focuses on the feature selection and feature fusion steps, shown within the bold dotted line.

2.1. Data Acquisition

The EMG and ECG data used in this study were acquired from 59 healthy, untrained young volunteers (23 years old on average). All subjects were free from any known disease, were briefed about the procedures involved, and gave informed consent. An EMG channel was recorded simultaneously with the ECG using the Porti system from Twente Medical Systems International (TMSi), a 32-channel ambulatory and stationary system for physiological research. The EMG channel was obtained from the vastus lateralis (VL) muscle, with the electrode placed according to the recommendations of the Surface Electromyography for the Non-Invasive Assessment of Muscles (SENIAM) project [53], in which 16 European expert groups collaborate on the development and application of surface EMG (sEMG); the recommended sEMG electrode placement is partly based on simulation outputs. The sampling frequency of the raw EMG and ECG biosignals was 2048 Hz. This high sampling frequency was chosen to prevent signal loss and excessive noise during the stepping exercise, and to avoid jitter in the estimation of the R-wave fiducial point of the ECG signal. The stepper machine was used to carry out the stepping activity. To obtain the same stepping rate among subjects, a metronome (60 beats per minute) was used during the stepping activity; a metronome is a device, commonly used by musicians, that marks time at a selected rate with a regular tick.

2.2. Biosignals Preprocessing

ECG: The Kubios HRV software [54] was utilized for ECG preprocessing to quantify the HRV. It is an advanced, easy-to-use package for HRV analysis, providing artifact correction (i.e., powerline interference, baseline wander and other noise components), trend component removal, adaptive QRS detection and analysis sample selection. The QRS detection algorithm in Kubios HRV is based on the Pan-Tompkins algorithm [55]. Figure 4a shows the pattern of the HRV.
EMG: All the EMG signals were processed using MATLAB. A low-amplitude voltage offset introduced by the hardware, i.e., the direct current (DC) component, was first eliminated. Then a bandpass filter with 20 Hz and 300 Hz cutoff frequencies was applied to the raw data to remove movement artifacts. Since the raw EMG signal has positive and negative values with a nearly zero mean [56], full-wave rectification was applied so that only positive values are analyzed. The linear envelope of the EMG signal was then computed for further processing. Figure 4b shows the pattern of the EMG.
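The EMG steps above (DC removal, 20-300 Hz band-pass, full-wave rectification, linear envelope) can be sketched as follows. The original processing was done in MATLAB, so this Python/SciPy version is only an illustration; the 4th-order Butterworth filters and the 6 Hz envelope cutoff are assumptions not stated in the text.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_linear_envelope(raw, fs=2048.0, band=(20.0, 300.0), env_cutoff=6.0):
    """Preprocess a raw EMG trace: DC removal, band-pass, rectification, envelope."""
    x = np.asarray(raw, dtype=float)
    x = x - np.mean(x)                          # remove the DC offset
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    x = filtfilt(b, a, x)                       # zero-phase 20-300 Hz band-pass
    rect = np.abs(x)                            # full-wave rectification
    b, a = butter(4, env_cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, rect)                 # low-pass -> linear envelope
```

Zero-phase filtering (`filtfilt`) is used so the envelope stays time-aligned with the ECG channel recorded in parallel.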

2.3. Feature Extraction

The features are used to classify males and females during the stepping activity on the stepper. The selected features were extracted using the Kubios software [54], following the descriptions for HRV features in the Standards of Measurement, Physiological Interpretation, and Clinical Use [57]. For EMG, the conventional features are the mean [58], standard deviation [58,59,60], root mean square [59,60,61], maximum amplitude [60], median frequency [61,62], mean frequency [62], variance [63], skewness [64] and kurtosis [64,65]. Table 3 provides a detailed description of the extracted features.

3. Combinatorial Fusion Approach (CFA)

Combinatorial Fusion (CF) (Hsu et al. [50,51,52]) provides a useful fusion methodology for analyzing a set of Multiple Scoring Systems (MSS) to obtain better accuracy and more effective feature characterization. Each system has a score function, a rank function, and a Rank-Score Characteristic (RSC) function (see Figure 5). The RSC function describes the scoring or ranking behavior of a system and is extremely easy to compute; it also allows the cognitive diversity of two or more scoring systems to be measured [52]. The CFA has been widely applied in research areas such as visual cognition systems [66], ChIP-seq peak detection [67], video target tracking [68] and protein structure prediction [69], where RSC graphs were used to assess scoring diversity [52]. In this work, the CFA is employed to demonstrate that the fusion performance of HRV and EMG for automatic gender recognition during the stepping activity can be improved by fusing score or rank combinations, using the RSC function as an indicator.
Figure 5 illustrates the RSC function in a set of MSS. Let T = {t_1, t_2, ..., t_n} be a set of subjects under study, let N = [1, n] be the set of integers from 1 to n, and let R be the set of real numbers. Each scoring system A, one of the q scoring systems A_1, A_2, ..., A_q on the set T, contains two functions:
(a)
a score function, s_A, and
(b)
a rank function, r_A.
The rank function r_A is derived by sorting the score function s_A. A Rank-Score Characteristic (RSC) function f_A was defined (see [51,52]) as f_A: N → R, relating the score function s_A and the rank function r_A as shown in Figure 5. Two distinct procedures can be applied to integrate a set of q scoring systems, namely:
(a)
Score Combination, sc.
(b)
Rank Combination, rc.
and the equations are defined as follows:
s_sc(t_i) = Σ_{j=1}^{q} s_{A_j}(t_i) / q (1)
s_rc(t_i) = Σ_{j=1}^{q} r_{A_j}(t_i) / q (2)
where t i is in T.
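As a concrete sketch of these definitions, the following illustrates a rank function, the RSC function f_A, and the score and rank combinations of Equations (1) and (2), under the convention (an assumption here) that higher scores are better and rank 1 is best:

```python
import numpy as np

def rank_function(scores):
    """r_A: rank 1 for the highest score, rank n for the lowest."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    ranks = np.empty(len(scores), dtype=int)
    ranks[order] = np.arange(1, len(scores) + 1)
    return ranks

def rsc_function(scores):
    """f_A: N -> R, mapping each rank j to the score of the item ranked j."""
    return np.sort(np.asarray(scores, dtype=float))[::-1]

def score_combination(score_rows):
    """Equation (1): s_sc(t_i) = sum_j s_Aj(t_i) / q; each row is one system."""
    return np.mean(np.asarray(score_rows, dtype=float), axis=0)

def rank_combination(score_rows):
    """Equation (2): s_rc(t_i) = sum_j r_Aj(t_i) / q."""
    return np.mean([rank_function(row) for row in score_rows], axis=0)
```

Note that for a rank combination a smaller combined value means better consensus rank, the opposite of a score combination.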
Moreover, for a pair of scoring systems A and B, the diversity d(A, B) is defined as d(f_A, f_B), the diversity between the RSC functions f_A and f_B (see [51,52]).

3.1. Feature Selection Using Performance and Cognitive Diversity

The features from the EMG and HRV are each viewed as a scoring system. The performance of and the diversity between the feature sets in the MSS are used to select relevant, non-redundant features, which yields a better result when the features are fused. The combination of MSS improves the result when the individual scoring systems have relatively high performance and high diversity [52]. Therefore, the aim of our feature selection is to find features with relatively good performance as well as relatively high diversity.
In our experiment, there are 32 features in total, each with 59 values (one per subject). The correct rate is calculated to measure performance and to compare machine learning classifiers (see Figure 6). The classifiers used in this work are Naïve Bayes (NB), k-Nearest Neighbors (k-NN with k = 1, 3, 5), and Decision Tree (J48). As shown in Figure 6, the J48 classifier yields the highest average correct rate, 72.19%.
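The per-feature correct rates in this paper come from the classifiers above (Weka's J48 in particular). As a self-contained illustration of how a single feature's 59 values can be scored, here is a hypothetical leave-one-out 1-NN correct rate; it is a stand-in, not the paper's evaluation protocol:

```python
import numpy as np

def loo_correct_rate(values, labels):
    """Leave-one-out 1-NN correct rate (%) for one feature's per-subject values."""
    v = np.asarray(values, dtype=float)
    y = np.asarray(labels)
    hits = 0
    for i in range(len(v)):
        d = np.abs(v - v[i])
        d[i] = np.inf                       # exclude the held-out subject
        hits += int(y[np.argmin(d)] == y[i])
    return 100.0 * hits / len(v)
```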
The performance of each feature under the J48 classifier is then sorted in decreasing order, as illustrated in Figure 7. The highest correct rate, 86.44%, is generated by feature O (LF_power (ms²)), and feature d (maximum amplitude of EMG) generates the lowest correct rate, 66.1%. Previous work [70] found significant gender differences (p < 0.05) in the LF power between male and female subjects, implying that the male subjects are more active and dominant than the female subjects. This makes LF power particularly suitable for separating male and female subjects when developing rehabilitation devices, control systems, and rehabilitation engineering.

3.1.1. The Diversity between MSS

Generally, the diversity of a feature pair {F_1, F_2} is derived as in Equation (3), where f_1 and f_2 are the RSC functions of F_1 and F_2, respectively, computed as defined in Figure 5. Both f_1 and f_2 have n score values in total, over ranks 1 to n.
d(F_1, F_2) = Σ_{i=1}^{n} |f_1(i) − f_2(i)| / n (3)
The diversity of a feature set D = {F_1, F_2, ..., F_n} is computed as in Equation (4), where C(n, 2) denotes the number of two-feature combinations in D.
d(D) = Σ_{F_i, F_j ∈ D} d(F_i, F_j) / C(n, 2) (4)
In addition, the diversity between two feature sets D_1 and D_2 is derived as in Equation (5), where |D_1| and |D_2| are the cardinalities of D_1 and D_2.
d(D_1, D_2) = Σ_{F_i ∈ D_1, F_j ∈ D_2} d(F_i, F_j) / (|D_1| · |D_2|) (5)
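Equations (3), (4) and (5) follow directly from the RSC definition. The sketch below assumes each feature is represented by its vector of (comparably scaled) score values over the n subjects:

```python
import numpy as np
from itertools import combinations

def rsc(scores):
    """RSC function: the j-th entry is the j-th largest score (rank j)."""
    return np.sort(np.asarray(scores, dtype=float))[::-1]

def pair_diversity(s1, s2):
    """Equation (3): mean absolute difference between two RSC functions."""
    return float(np.mean(np.abs(rsc(s1) - rsc(s2))))

def set_diversity(feature_scores):
    """Equation (4): average pairwise diversity within a feature set D."""
    pairs = list(combinations(feature_scores, 2))
    return sum(pair_diversity(a, b) for a, b in pairs) / len(pairs)

def cross_diversity(D1, D2):
    """Equation (5): average diversity between feature sets D1 and D2."""
    return sum(pair_diversity(a, b) for a in D1 for b in D2) / (len(D1) * len(D2))
```

Because the RSC function sorts the scores, two features that rank the subjects very differently can still have zero diversity if their sorted score profiles coincide; diversity here measures scoring behavior, not agreement on individual subjects.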

3.1.2. Feature Selection Approach

The approach for feature selection based on the performance of and the diversity between the two feature sets (Deng, Wu, Chu, Zhang, and Hsu [46]) is illustrated in Figure 8.
Figure 9 illustrates the modalities, performance, and diversity of the features. All 32 features are divided into two modalities according to the corresponding sensor type, EMG or ECG. The feature performances within each modality are sorted in decreasing order. The 15 features shown in bold in the figure are selected for further processing and are denoted the top features. The number above the horizontal link between the two modalities denotes the diversity between the two feature sets, while the number above the curve within a modality node is the average diversity of that feature set. Generally, a larger diversity between two modalities produces better fusion performance [46], and better individual feature performance also yields a better combination result [46]. Since only two modalities are utilized in this work, a comparison of diversities across modality pairs is not required. Here, the diversity between EMG and HRV yields good within-set and between-set diversity, so a better combination performance can be achieved.
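The paper does not spell out a selection algorithm beyond choosing features with high performance and high diversity. One plausible greedy sketch, with a hypothetical weighting parameter `alpha` trading the two criteria off (not something the paper specifies), is:

```python
def select_top_features(perf, diversity, k=15, alpha=0.5):
    """Greedy sketch: seed with the best-performing feature, then repeatedly add
    the feature maximizing alpha*performance + (1-alpha)*mean diversity to the
    already chosen set. perf: {name: correct rate}; diversity: {(a, b): d}."""
    def div(a, b):
        return diversity[(a, b)] if (a, b) in diversity else diversity[(b, a)]

    chosen = [max(perf, key=perf.get)]
    while len(chosen) < k:
        rest = [f for f in perf if f not in chosen]
        best = max(rest, key=lambda f: alpha * perf[f] +
                   (1 - alpha) * sum(div(f, c) for c in chosen) / len(chosen))
        chosen.append(best)
    return chosen
```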

3.2. Feature Selection Performances

In the feature selection process, the top-15 features, O, V, A, F, C, G, M and I from the HRV and a, b, c, e, f, g, and i from the EMG, are selected because their performance and diversity are higher than those of the other features. A total of four 5-feature sets (A), four 7-feature sets (C), four 9-feature sets (D) and four 11-feature sets (E) are then generated from them. In addition, four 5-feature sets (B) are randomly selected from the 32 extracted features for performance comparison. The average performance and average diversity of the selected feature sets are given in Table 4.

4. CFA for EMG and HRV Feature Fusion

In the feature fusion process, the score combination for each of the 20 selected feature sets from Table 4 is performed one by one according to Equation (1).
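The score combination of Equation (1) over every subset of a feature set can be enumerated as below. A 5-feature set yields C(5,2) + C(5,3) + C(5,4) + C(5,5) = 26 combined features, matching the counts of 26, 120, 502 and 2036 reported for the 5-, 7-, 9- and 11-feature sets; the dictionary-based representation is an illustration, not the paper's implementation.

```python
from itertools import combinations

def combined_feature_scores(feature_scores):
    """All score combinations of two or more features (Equation (1)).
    feature_scores: {name: [score per subject]}; returns {combined name: scores}."""
    names = sorted(feature_scores)
    n = len(next(iter(feature_scores.values())))
    combined = {}
    for r in range(2, len(names) + 1):        # subsets of size 2 .. q
        for subset in combinations(names, r):
            combined["".join(subset)] = [
                sum(feature_scores[f][i] for f in subset) / r for i in range(n)
            ]
    return combined
```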

4.1. Fusion Results in 5-Feature Sets

The feature fusion correct rates using the score combination of the CFA for the 5-feature sets (A) and (B) are illustrated in Figure 10 and Figure 11, respectively.
With reference to Figure 10 and Figure 11, the dotted lines show the average correct rate of the 26 combined features. In this paper, the combined features whose correct rate exceeds this average are taken as the better combined features for automatic gender identification. In Figure 10a, the average correct rate is 95.49%, with top-13 combined features OV, OA, OVA, OVF, OF, VA, OVAF, OAF, VF, VAF, AF, OVAa, and OVAFa. In Figure 10b, the average correct rate is 93.53%, with top-13 combined features OV, OA, OVA, VA, OVAb, OVAc, OVb, OVc, OAb, OAc, VAb, VAc, and OVAbc. In Figure 10c, the average correct rate is 93.73%, with top-13 combined features OV, OVF, OF, VF, OVFa, OVa, OVFb, OVb, OFa, OFb, VFa, Oa, and OVFab. Lastly, in Figure 10d, the average correct rate is 95.27%, with top-13 combined features OV, OA, OVA, OVF, OF, VA, OVAF, OAF, VF, VAF, AF, OVAFb, and OVAb. In short, each of the four 5-feature sets (A) has its own top-13 combined features that contribute to automatic gender identification during the stepping activity using the stepper.
Moreover, in Figure 11a, the average correct rate is 78.36%, with top-12 combined features IP, IU, IPU, Ic, IPc, IPUc, IUc, PU, IPd, IPUd, Pc, and PUc. In Figure 11b, the average correct rate is 77.75%, with top-12 combined features HR, HRa, Ha, Ra, HRh, HER, HRah, EHRa, Hh, EH, Hah, and EHRh. In Figure 11c, the average correct rate is 80.34%, with top-12 combined features HI, HIJ, IJ, HJ, HIc, HIJc, IJc, Ic, HJc, HIJd, Hc, and HId. Lastly, in Figure 11d, the average correct rate is 77.85%, with top-12 combined features IQ, IS, IQS, QS, IQi, IQSi, Ii, ISi, IQSd, IQd, ISd, and Id. Thus, each of the four 5-feature sets (B) has its own top-12 combined features.
Figure 10 and Figure 11 show that the four 5-feature sets (A) have higher correct rates than the four 5-feature sets (B). Thus, the overall performance of the 5-feature sets (A) is considerably better than that of the 5-feature sets (B) for automatic gender identification during the stepping activity.
The fusion of the t-feature sets with t = 7, 9 and 11 is also calculated: the 7-feature sets have 120 combined features, the 9-feature sets 502, and the 11-feature sets 2036 in total. Since the number of combinations is large, only the results are listed, in Table 5 in Section 4.2.

4.2. Fusion Results Comparison

For the feature fusion evaluation, six fusion algorithms are implemented for comparison: Naïve Bayes (NB), k-Nearest Neighbors (k-NN with k = 1, 3, 5), Decision Tree (J48) and Support Vector Machine (SVM). The correct rates of the fusion methods for the 20 different feature sets are compared in Table 5. The average correct rate over all combination results for each selected feature set is denoted the overall correct rate of the CFA for automatic gender recognition during stepping.
With reference to Table 5, for the feature sets selected using our method, combinatorial fusion performs better than the other six fusion algorithms: the CFA generates a higher average correct rate for all the 5-feature sets (A), 5-feature sets (B), 7-feature sets (C), 9-feature sets (D) and 11-feature sets (E). Overall, for automatic gender recognition during the stepping exercise, the CFA outperforms the other six fusion methods.
Table 5 also shows that, using combinatorial fusion, the highest average correct rate is 95.49% for feature set A(5,1), 93.22% for feature set C(7,2), 91.33% for feature set D(9,3) and 91.20% for feature set E(11,4). Thus, the correct rate decreases as the number of features increases: the more features are included, the more irrelevant and redundant features enter the combination, decreasing accuracy. Nevertheless, all the selected feature sets outperformed the other six fusion algorithms.
Meanwhile, the best performance on the randomly selected 5-feature sets (B) is 86.44%, produced by the 1-NN method on the B(5,3) feature set; the highest performance using the CFA on the same feature set is 80.41%. Although the CFA's average performance of 78.64% is lower than this 86.44%, it is better than the average performance of the 5-feature sets (B) using 1-NN and than the highest average correct rate of the four 5-feature sets (B), which is 76.27%. Therefore, the CFA is still better than the six fusion methods for the 5-feature set (B) group.
To further evaluate the proposed method, its results are compared with the work in [8] and [9], as shown in Table 6. Barani et al. and Nazarloo et al. used face biometrics with hybrid Gabor filters and binary image features for gender recognition. Table 6 shows that our proposed method outperformed the work in [8,9].

5. Conclusions and Future Work

In this paper, the CFA [51,52] was applied to both feature selection and feature fusion in order to recognize gender automatically during the stepping exercise. The performance results demonstrated that the CFA provides a promising solution, compared to other machine learning methods, for combining relevant features from physiological biosignals. The CFA also provided a much better correct rate than Barani's [8] and Nazarloo's [9] work. Generally, when the performance of and diversity between the selected features are relatively high, a better feature fusion correct rate can be attained.
The main contributions of this work are: the use of EMG and HRV physiological biometrics for automatic gender recognition during stepping on a stepper; a feature selection technique based on the performance of and the diversity between feature sets; a feature fusion technique using the CFA; a comparison of the CFA with other machine learning classifiers; and a comparison of our proposed method with previous gender recognition research. Our work demonstrated that the CFA is a good and efficient method for feature fusion.
In future work, the HRV features will be combined with bi-channel EMG features based on the CFA. In addition, decision-level combination using the CFA will be considered.

Acknowledgments

The work presented in this study is funded by the Ministry of Education Malaysia, Kuala Lumpur, Malaysia and Universiti Teknologi Malaysia, Kuala Lumpur, Malaysia, under research grants Vote Nos. 15H81 and 13H73.

Author Contributions

Nor Aziyatul Izni Mohd Rosli and Mohd Azizi Abdul Rahman conceived and designed the experiments. Nor Aziyatul Izni Mohd Rosli and Malarvili Balakrishnan performed the experiments. Nor Aziyatul Izni analyzed the data. Mohd Azizi Abdul Rahman, Malarvili Balakrishnan, Takashi Komeda, Saiful Amri Mazlan and Hairi Zamzuri contributed data acquisition tools. Nor Aziyatul Izni Mohd Rosli and Mohd Azizi Abdul Rahman wrote the paper. All authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ECG  Electrocardiogram
EMG  Electromyogram
EEG  Electroencephalogram
HRV  Heart Rate Variability
HR  Heart Rate
CFA  Combinatorial Fusion Approach
PDs  Physiological Devices
MR  Magnetorheological
NB  Naïve Bayes
SVM  Support Vector Machine
J48  Decision Tree
kNN  k-Nearest Neighbor
MSS  Multiple Scoring Systems
BCI  Brain-Computer Interface
HMI  Human-Computer Interaction
RSC  Rank-Score Characteristic
SC  Score Combination
RC  Rank Combination
VL  Vastus Lateralis
RRi  RR Interval
DAQ  Data Acquisition
RMS  Root Mean Square
SD  Standard Deviation
VLF  Very Low Frequency
LF  Low Frequency
HF  High Frequency
LBP  Local Binary Pattern
DC  Direct Current

References

  1. Miller, A.E.J.; MacDougall, J.D.; Tarnopolsky, M.A.; Sale, D.G. Gender differences in strength and muscle fiber characteristics. Eur. J. Appl. Physiol. 1993, 66, 254–262. [Google Scholar] [CrossRef]
  2. Samsøe, B.D.; Bartel, E.M.; Low, P.M.B.; Lund, H.; Stockmarr, A.; Holm, C.C.; Wätjen, I.; Appleyard, M.; Bliddal, H. Isokinetic and isometric muscle strength in a healthy population with special reference to age and gender. Acta Physiol. 2009, 197, 1–68. [Google Scholar] [CrossRef] [PubMed]
  3. Kubota, H.; Demura, S. Gender differences and laterality in maximal handgrip strength and controlled force exertion in young adults. Health 2011, 3, 684–688. [Google Scholar] [CrossRef]
  4. Ratnakar, A.; More, G. Real time gender recognition on FPGA. Int. J. Sci. Eng. Res. 2015, 6, 19–22. [Google Scholar]
  5. Tapia, J.E.; Perez, C.A. Gender classification based on fusion of different spatial scale features selected by mutual information from histogram of LBP, intensity and shape. IEEE Trans. Inf. Forensics Secur. 2013, 8, 488–499. [Google Scholar] [CrossRef]
  6. Anil, K.J.; Ross, A.; Prabhakar, S. An introduction to Biometric Recognition. IEEE Trans. Circuits Syst. Video Technol. 2004, 14, 4–20. [Google Scholar]
  7. Zahedi, M.; Yousefi, S. Gender Recognition based on sift features. Int. J. Artif. Intell. Appl. 2011, 2, 87–94. [Google Scholar]
  8. Barani, M.J.; Faez, K.; Jalili, F. Implementation of gabor filters combined with binary features for gender recognition. Int. J. Electr. Comput. Eng. 2014, 4, 108–115. [Google Scholar]
  9. Nazarloo, M.; Parcham, E.; Pourani, R.A. Gender classification using hybrid of gabor filters and binary features. Int. J. Electr. Comput. Eng. 2014, 4, 539–547. [Google Scholar] [CrossRef]
  10. Danisman, T.; Bilasco, I.M.; Martinet, J. Boosting gender recognition performance with a fuzzy inference system. Expert Syst. Appl. 2015, 42, 2772–2784. [Google Scholar] [CrossRef]
  11. Castrillón-Santana, M.; Lorenzo-Navarro, J.; Ramón-Balmaseda, E. On using periocular biometric for gender classification in the wild. Pattern Recognit. Lett. 2016, 82, 181–189. [Google Scholar] [CrossRef]
  12. Hu, M.; Wang, Y.; Zhang, Z.; Zhang, D. Gait-based gender classification using mixed conditional random field. IEEE Trans. Syst. Man Cybern. B Cybern. 2011, 41, 1429–1439. [Google Scholar] [PubMed]
  13. Ali, S.; Zhou, M.; Wu, Z.; Razzaq, A.; Hamada, M.; Ahmed, H. Comprehensive use of hip joint in gender identification using 3-dimension data. Telkomnika 2013, 11, 2933–2941. [Google Scholar] [CrossRef]
  14. Lu, J.; Wang, G.; Moulin, P. Human identity and gender recognition from gait sequences with arbitrary walking directions. IEEE Trans. Inf. Forensics Secur. 2014, 9, 51–61. [Google Scholar] [CrossRef]
  15. Das, D.; Chakrabarty, A. Human gait based gender identification system using hidden markov model and support vector machines. Int. Conf. Comput. Commun. Autom. 2015, 268–272. [Google Scholar] [CrossRef]
  16. Yuan, Y.; Pang, Y.; Li, X. Footwear for Gender Recognition. IEEE Trans. Circuits Syst. Video Technol. 2010, 20, 131–135. [Google Scholar] [CrossRef]
  17. Lin, G.S.; Zhao, Y.J. A feature-based gender recognition method based on color information. In Proceedings of the 2011 First International Conference on Robot, Vision and Signal Processing, Kaohsiung City, Taiwan, 21–23 November 2011; pp. 40–43. [Google Scholar]
  18. Rakesh, K.; Duta, S.; Shama, K. Gender Recognition using speech processing techniques in Labview. Int. J. Adv. Eng. Technol. 2011, 1, 51–63. [Google Scholar]
  19. Yücesoy, E.; Nabiyev, V.V. A new approach with score-level fusion for the classification of a speaker age and gender. Comput. Electr. Eng. 2016, 53, 29–39. [Google Scholar] [CrossRef]
  20. Nemirko, A.P.; Lugovaya, T.S. Biometric human identification based on electrocardiogram. In Proceedings of the XIIIth Russian Conference on Mathematical Methods of Pattern Recognition, Moscow, Russia, 20–26 November 2005; pp. 387–390. [Google Scholar]
  21. Bassiouni, M.; Khaleefa, W.; El-Dahshan, E.A.; Salem, A.B.M. A machine learning technique for person identification using ECG signals. Int. J. Appl. Phys. 2016, 1, 37–41. [Google Scholar]
  22. Singh, Y.N.; Singh, S.K. Evaluation of Electrocardiogram for Biometric Authentication. J. Inf. Secur. 2012, 3, 39–48. [Google Scholar] [CrossRef]
  23. Sufi, F.; Khalil, I. Faster person identification using compressed ECG in time critical wireless telecardiology applications. J. Netw. Comput. Appl. 2011, 34, 282–293. [Google Scholar] [CrossRef]
  24. Merone, M.; Soda, P.; Sansone, M.; Sansone, C. ECG databases for biometric systems: A systematic review. Expert Syst. Appl. 2016, 67, 189–202. [Google Scholar] [CrossRef]
  25. Hoekema, R.; Uijen, G.J.H.; Oosterom, A.V. Geometrical Aspects of the Inter-individual variability of multilead ECG recordings. Comput. Cardiol. 1999, 26, 499–502. [Google Scholar]
  26. Tripathy, R.K.; Acharya, A.; Choudary, S.K. Gender classification from ECG signal analysis using Least Square Support Vector Machine. Am. J. Signal Process. 2012, 2, 145–149. [Google Scholar] [CrossRef]
  27. Nazmi, N.; Rahman, M.A.A.; Mazlan, S.A.; Zamzuri, H. Electromyography (EMG) based signal analysis for physiological device application in lower limb rehabilitation. In Proceedings of the 2nd International Conference on Biomedical Engineering (ICoBe), Penang, Malaysia, 30–31 March 2015; pp. 77–82. [Google Scholar]
  28. Sung, P.S.; Lee, D.C. Gender differences in onset timing and activation of the muscles of the dominant knee during stair climbing. Knee 2009, 16, 375–380. [Google Scholar] [CrossRef] [PubMed]
  29. Zhang, D.; Wang, Y.H. Gender recognition based on fusion of face and gait information. In Proceedings of the Seventh International Conference on Machine Learning and Cybernetics, Kunming, China, 12–15 July 2008; pp. 62–67. [Google Scholar]
  30. Li, X.; Zhao, X.; Fu, Y.; Liu, Y. Bimodal gender recognition from face and fingerprint. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; pp. 2590–2597. [Google Scholar]
  31. Prabhu, G.; Poormina, S. Minimize search time through gender classification from multimodal biometrics. Procedia Comput. Sci. 2015, 50, 289–294. [Google Scholar] [CrossRef]
  32. Greene, B.R.; Boylan, G.B.; Reilly, R.B.; Chazal, P.D.; Connolly, S. Combination of EEG and ECG for improved automatic neonatal seizure detection. Clin. Neurophysiol. 2007, 118, 1348–1359. [Google Scholar] [CrossRef] [PubMed]
  33. Balakrishnan, M.; Mesbah, M. Combining newborn EEG and HRV information for automatic seizure detection. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–24 August 2008; pp. 4756–4759. [Google Scholar]
  34. Mesbah, M.; Balakrishnan, M.; Colditz, P.B.; Boashash, B. Automatic seizure detection based on the combination of newborn multi-channel EEG and HRV information. EURASIP J. Adv. Signal Process. 2012, 2012, 1–14. [Google Scholar] [CrossRef]
  35. Leeb, R.; Sagha, H.; Chavarriaga, R.; Millan, J.S.D. Multimodal fusion of muscle and brain signals for a hybrid-BCI. In Proceedings of the 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 4343–4346. [Google Scholar]
  36. Bermudez, T.; Lowe, D.; Arlaud-Lamborelle, A.M. EEG/ECG information fusion for epileptic event detection. In Proceedings of the IEEE 16th International Conference on Digital Signal Processing, Santorini, Greece, 5–7 July 2009; pp. 1–8. [Google Scholar]
  37. Xie, P.; Chen, X.; Ma, P.; Li, X.; Su, Y. Identification method of human movement intention based on the fusion feature of EEG and EMG. Proc. World Congr. Eng. 2013, II, 1340–1344. [Google Scholar]
  38. Yu, S.; Chen, X.; Wang, B.; Wang, X. Automatic sleep stage classification based on ECG and EEG features for day time short nap evaluation. In Proceedings of the 10th World Congress on Intelligent Control and Automation, Beijing, China, 6–8 July 2012; pp. 4974–4977. [Google Scholar]
  39. Sherwani, K.I.K.; Kumar, N. Fusion of EEG and EMG signals for gait intent detection. MMU J. Manag. Tech. 2016, 1, 50–55. [Google Scholar]
  40. Rosli, N.A.I.M.; Rahman, M.A.A.; Mazlan, S.A.; Zamzuri, H. Electrocardiographic (ECG) and Electromyography (EMG) signals fusion for physiological device in rehab application. In Proceedings of the 2014 IEEE Student Conference on Research and Development, Penang, Malaysia, 16–17 December 2014; pp. 1–5. [Google Scholar]
  41. Rosli, N.A.I.M.; Rahman, M.A.A.; Malarvili, M.B.; Mazlan, S.A.; Zamzuri, H. The fusion of HRV and EMG signals for automatic gender recognition during stepping exercise. TELKOMNIKA 2017, in press. [Google Scholar]
  42. Gupta, L.; Chung, B.; Srinath, M.D.; Molfese, D.L.; Kook, H. Multichannel fusion models for the parametric classification of differential brain activity. IEEE Trans. Bio-Med Eng. 2005, 52, 1869–1881. [Google Scholar] [CrossRef] [PubMed]
  43. Atrey, P.K.; Hossain, M.A.; El Saddik, A.; Kankanhalli, M.S. Multimodal fusion for multimedia analysis: A survey. Multimed. Syst. 2010, 16, 345–379. [Google Scholar] [CrossRef]
  44. Lee, P.H.; Hung, J.Y.; Hung, Y.P. Automatic gender recognition using fusion of facial strips. In Proceedings of the 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010; pp. 1140–1143. [Google Scholar]
  45. Zokaee, S.; Faez, K. Human identification based on ECG and palmprint. Int. J. Electr. Comput. Eng. 2012, 2, 261–266. [Google Scholar] [CrossRef]
  46. Deng, Y.; Wu, Z.; Chu, C.H.; Zhang, Q.; Hsu, D.F. Sensor feature selection and combination for stress identification using combinatorial fusion. Int J. Adv. Robot. Syst. 2013, 10, 306–313. [Google Scholar] [CrossRef]
  47. Orphanidou, C.; Fleming, S.; Shah, S.A.; Tarassenko, L. Data fusion for estimating respiratory rate from a single-lead ECG. Biomed. Signal Process. Contr. 2013, 8, 98–105. [Google Scholar] [CrossRef]
  48. Triloka, J.; Senanayake, S.M.N.A.; Lai, D. Neural computing for walking gait pattern identification based on multi-sensor data fusion of lower limb muscles. Neural Comput. Appl. 2016, 1–13. [Google Scholar] [CrossRef]
  49. Rosli, N.A.I.M.; Rahman, M.A.A.; Mazlan, S.A.; Zamzuri, H. A review fusion of EMG and HRV biofeedback for physiological device interface in rehabilitation. In Recent Trends & Applications in HRV; Malarvili, M.B., Ed.; Publishing House: Johor Bahru, Malaysia, 2016; pp. 29–45. [Google Scholar]
  50. Hsu, D.F.; Taksa, I. Comparing rank and score combination methods for data fusion in information retrieval. Inf. Retr. 2005, 8, 449–480. [Google Scholar]
  51. Hsu, D.F.; Chung, Y.S.; Kristal, B.S. Combinatorial fusion analysis: Methods and practice of combining multiple scoring systems. In Advanced Data Mining Technologies in Bioinformatics; Idea Group Inc.: Calgary, AB, Canada, 2006; pp. 32–62. [Google Scholar]
  52. Hsu, D.F.; Kristal, B.S.; Schweikert, C. Rank-score characteristics (RSC) function and cognitive diversity. Brain Inform. 2010, LNAI 6334, 42–54. [Google Scholar]
  53. Hermens, H.J.; Freriks, B.; Merletti, R.; Stegeman, D.; Blok, J.; Rau, G.; Disselhorst-Klug, C.; Hagg, G. European recommendations for surface electromyography. Roessingh Res. Dev. 1999, 8, 13–54. [Google Scholar]
  54. Tarvainen, M.P.; Niskanen, J.P.; Lipponen, J.A.; Ranta-Aho, P.O.; Karjalainen, P.A. Kubios HRV-Heart Rate Variability analysis software. Comput. Methods Progr. Biomed. 2014, 113, 210–220. [Google Scholar] [CrossRef] [PubMed]
  55. Pan, J.; Tompkins, W.J. A real-time QRS detection algorithm. IEEE Trans. Biomed. Electrophysiol. 1985, 32, 230–236. [Google Scholar] [CrossRef] [PubMed]
  56. Nazmi, N.; Rahman, M.A.A.; Yamamoto, S.; Ahmad, S.A.; Zamzuri, H.; Mazlan, S.A. A review of classification techniques of EMG signals during isotonic and isometric contractions. Sensors 2016, 16, 1304. [Google Scholar] [CrossRef] [PubMed]
  57. Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Heart Rate Variability-Standards of Measurement, Physiological Interpretation, and Clinical Use. Circulation 1996, 93, 1043–1065. [Google Scholar]
  58. Varol, H.A.; Sup, F.; Goldfarb, M. Multiclass real-time intent recognition of a powered lower limb prosthesis. IEEE Trans. Biomed. Eng. 2010, 57, 542–551. [Google Scholar] [CrossRef] [PubMed]
  59. Triloka, J.; Senanayake, S.M.N.A.; Lai, D. Neural computing for walking gait pattern identification based on multi-sensor data fusion of lower limb muscles. Neural Comput. Appl. 2016, 1–13. [Google Scholar] [CrossRef]
  60. Daud, W.M.B.W.; Yahya, A.B.; Horng, C.H.; Sulaima, M.F.; Sudirman, R. Features extraction of Electromyography signals in time domain on biceps brachii muscle. Int. J. Model. Optim. 2013, 3, 515–519. [Google Scholar] [CrossRef]
  61. Jones, A.A.; Power, G.A.; Herzog, W. History dependence of the electromyogram: Implications for isometric steady-state EMG parameters following a lengthening or shortening contraction. J. Electromyogr. Kinesiol. 2016, 27, 30–38. [Google Scholar] [CrossRef] [PubMed]
  62. Phinyomark, A.; Thongpanja, S.; Hu, H.; Phukpattaranont, P.; Limsakul, C. The usefulness of mean and median frequency in Electromyography analysis. In Computational Intelligence in Electromyography Analysis—A Perspective on Current Applications and Future Challenges; Naik, R.G., Ed.; InTech: Rijeka, Croatia, 2012; pp. 195–220. [Google Scholar]
  63. Rokicki, L.A.; Houle, T.T.; Dhingra, L.K.; Weinland, S.R.; Urban, A.M.; Bhalla, R.K. A preliminary analysis of EMG variance as an index of change in EMG biofeedback treatment of tension-type headache. Appl. Psychophysiol. Biofeedback 2003, 28, 205–215. [Google Scholar] [CrossRef] [PubMed]
  64. Harrach, M.A.; Boudaoud, S.; Gamet, D.; Grosset, J.F.; Marin, F. Evaluation of HD-sEMG probability density function deformations in ramp exercise. In Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Chicago, IL, USA, 26–30 August 2014; pp. 2209–2212. [Google Scholar]
  65. Zhang, X.; Barkhaus, P.E.; Rymer, W.Z. Machine learning for supporting diagnosis of amyotrophic lateral sclerosis using surface electromyogram. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 22, 96–103. [Google Scholar] [CrossRef] [PubMed]
  66. Batallones, A.; Sanchez, K.; Mott, B.; Coffran, C.; Hsu, D.F. On the combination of two visual cognition systems using combinatorial fusion. Brain Inform. 2015, 2, 21–32. [Google Scholar] [CrossRef] [PubMed]
  67. Schweikert, C.; Brown, S.; Tang, Z.; Smith, P.R.; Hsu, D.F. Combining multiple ChIP-seq peak detection systems using combinatorial fusion. BMC Genom. 2012, 13, S1–S12. [Google Scholar] [CrossRef] [PubMed]
  68. Lyons, D.M.; Hsu, D.F. Combining multiple scoring systems for target tracking using rank-score characteristics. Inform. Fusion 2009, 10, 124–136. [Google Scholar] [CrossRef]
  69. Lin, K.L.; Lin, C.Y.; Huang, C.D.; Chang, H.M.; Yang, C.Y.; Lin, C.T.; Tang, C.Y.; Hsu, D.F. Feature selection and combination criteria for improving accuracy in protein structure prediction. IEEE Trans. Nanobiosci. 2007, 6, 186–196. [Google Scholar] [CrossRef]
  70. Rosli, N.A.I.M.; Rahman, M.A.A.; Malarvili, M.B.; Komeda, T.; Mazlan, S.A.; Zamzuri, H. The gender effects of heart rate variability response during short-term exercise using stair stepper from statistical analysis. Indones. J. Electr. Eng. Comput. Sci. 2016, 2, 359–366. [Google Scholar] [CrossRef]
Figure 1. Conventional stepper vs. modified stepper. (a) Conventional stepper; (b) modified stepper (magnetorheological (MR) valve).
Figure 2. The need for this research.
Figure 3. Gender identification process based on electromyography-heart rate variability (EMG-HRV) feature fusion. The selected features of EMG and HRV are combined using Combinatorial Fusion Approach (CFA) to obtain the gender classification result.
Figure 4. The patterns of HRV and EMG.
Figure 5. Rank-Score Characteristics (RSC) function.
Figure 6. Average correct rate comparison among classifiers.
Figure 7. Single-feature performance of the J48 classifier.
Figure 8. Feature selection approach (FSA).
Figure 9. Feature classification using diversity and performance in two modalities.
Figure 10. Performance of the combination of 5-feature sets (group A) using CFA.
Figure 11. Performance of the combination of 5-feature sets (group B) using CFA.
Table 1. Most related works for gender identification.
Work | Method
[7] | SIFT features
[8] | Combination of Gabor filters and binary features
[9] | Hybrid of Gabor filters and binary features
[10] | Face-based gender recognition with a fuzzy inference system
[11] | Periocular biometrics for gender classification in the wild
[12] | Gait-based, using a mixed conditional random field
[13] | Hip joint data computed from Bio-vision hierarchical data
[14] | Gait sequences with arbitrary walking directions
[15] | Human-gait-based identification using hidden Markov models and support vector machines
[16] | Footwear appearance
[17] | Color information
[18,19] | Speech recognition
This research | Fusion of HRV and EMG signals during stepping exercise on a stepper
Table 2. Most related work for the fusion of physiological signals.
Work | Signals Fused | Application
[32,33,34] | ECG + EEG | Automatic seizure detection
[35] | EMG + EEG | Hybrid BCI
[36] | ECG + EEG | Epileptic event detection
[37] | EMG + EEG | Human movement intention
[38] | ECG + EEG | Automatic sleep stage classification
[39] | EMG + EEG | Gait intent detection
[40,41] and this research | ECG + EMG | Gender identification for rehabilitation exercise
Table 3. Description of the 32 extracted features.
HRV features (23)

Time domain:
Symbol | Name | Units | Description
A | mean_HRV | ms | Mean of all RR intervals.
B | SDNN | ms | Standard deviation of all NN intervals.
C | mean_HR | 1/min | Mean of the heart rate.
D | SD_HR | 1/min | Standard deviation of the heart rate.
E | RMSSD | ms | Square root of the mean of the squared differences between adjacent NN intervals.
F | NN50 | count | Number of pairs of adjacent NN intervals differing by more than 50 ms in the entire recording. Three variants are possible: counting all such NN interval pairs, or only pairs in which the first or the second interval is longer.
G | pNN50 | % | NN50 count divided by the total number of all NN intervals.
H | HRV triangular index | - | Total number of NN intervals divided by the height of the histogram of all NN intervals, measured on a discrete scale with bins of 7.8125 ms (1/128 s).
I | TINN | ms | Baseline width of the minimum-square-difference triangular interpolation of the highest peak of the histogram of all NN intervals.

Frequency domain:
J | VLF_peak | Hz | Peak frequency of the very-low-frequency band.
K | LF_peak | Hz | Peak frequency of the low-frequency band.
L | HF_peak | Hz | Peak frequency of the high-frequency band.
M | VLF_power | ms² | Power of the very-low-frequency band.
N | VLF_percentage | % | Percentage power of the very-low-frequency band.
O | LF_power | ms² | Power of the low-frequency band.
P | LF_percentage | % | Percentage power of the low-frequency band.
Q | LF_norm | n.u. | Low-frequency power in normalized units.
R | HF_power | ms² | Power of the high-frequency band.
S | HF_percentage | % | Percentage power of the high-frequency band.
T | HF_norm | n.u. | High-frequency power in normalized units.
U | LF/HF | - | Ratio of LF power (ms²) to HF power (ms²).
V | TP | ms² | Total power: variance of all NN intervals.
W | EDR | Hz | ECG-derived respiration frequency.

EMG features (9)

Symbol | Name | Units | Description
a | EMG_mean | ms² | Normalized mean of all the EMG data.
b | EMG_SD | ms² | Standard deviation of all the EMG data.
c | EMG_RMS | ms² | Square root of the mean over time of the squared vertical distance of the signal from the rest state.
d | EMG_maxamp | ms² | Maximum amplitude of the EMG.
e | EMG_MDF | Hz | Median frequency of the EMG.
f | EMG_MNF | Hz | Mean frequency of the EMG.
g | EMG_var | ms² | Variance of the EMG.
h | EMG_skew | - | Skewness of the EMG.
i | EMG_kurt | - | Kurtosis of the EMG.
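The time-domain HRV entries above (mean_HRV, SDNN, mean_HR, RMSSD, NN50, pNN50) are defined explicitly enough to compute directly from an RR-interval series. As an illustrative sketch only (not the authors' code; the function and variable names are hypothetical):

```python
import numpy as np

def hrv_time_features(rr_ms):
    """Compute a few of the time-domain HRV features of Table 3
    from a sequence of RR intervals given in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)  # successive differences between adjacent intervals
    return {
        "mean_HRV": rr.mean(),               # A: mean of all RR intervals (ms)
        "SDNN": rr.std(ddof=1),              # B: standard deviation of NN intervals (ms)
        "mean_HR": 60000.0 / rr.mean(),      # C: mean heart rate (1/min)
        "RMSSD": np.sqrt(np.mean(diff**2)),  # E: RMS of successive differences (ms)
        "NN50": int(np.sum(np.abs(diff) > 50.0)),                 # F: pairs > 50 ms apart
        "pNN50": 100.0 * np.sum(np.abs(diff) > 50.0) / len(rr),   # G: NN50 / total intervals (%)
    }
```

For example, `hrv_time_features([800, 810, 790, 860, 805])` yields a mean RR of 813 ms and an NN50 count of 2 (the 70 ms and 55 ms jumps).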
Table 4. Feature selection results.
Feature Group | Feature Set | Feature Symbols | Average Performance | Average Diversity
(A) 5-feature sets | A(5,1) | O, V, A, F, a | 0.81 | 2.39
 | A(5,2) | O, V, A, b, c | 0.79 | 2.61
 | A(5,3) | O, V, F, a, b | 0.78 | 2.58
 | A(5,4) | O, V, A, F, b | 0.81 | 2.41
(B) 5-feature sets | B(5,1) | I, P, U, c, d | 0.70 | 0.10
 | B(5,2) | E, H, R, a, h | 0.70 | 0.09
 | B(5,3) | H, I, J, c, d | 0.71 | 0.12
 | B(5,4) | I, Q, S, d, i | 0.70 | 0.10
(C) 7-feature sets | C(7,1) | O, V, C, G, M, f, i | 0.78 | 2.06
 | C(7,2) | O, V, A, F, I, b, e | 0.79 | 2.10
 | C(7,3) | O, A, F, C, G, c, e | 0.79 | 1.85
 | C(7,4) | O, V, A, F, G, e, g | 0.79 | 2.11
(D) 9-feature sets | D(9,1) | V, C, G, M, I, c, e, g, i | 0.75 | 0.85
 | D(9,2) | V, A, G, M, I, a, e, f, g | 0.75 | 0.97
 | D(9,3) | O, V, C, G, M, a, c, e, f | 0.76 | 1.84
 | D(9,4) | O, V, F, M, I, a, f, g, i | 0.76 | 1.92
(E) 11-feature sets | E(11,1) | O, V, A, F, G, M, a, b, c, e, g | 0.76 | 2.46
 | E(11,2) | O, V, F, C, G, M, I, a, b, c, e | 0.76 | 2.16
 | E(11,3) | O, V, A, F, G, M, I, b, c, f, g | 0.77 | 2.47
 | E(11,4) | O, V, A, F, C, G, I, a, b, f, g | 0.77 | 2.48
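The average diversity values in Table 4 come from the rank-score characteristic (RSC) functions of the CFA framework [50,51,52]: each feature's scores, re-ordered by rank, form an RSC curve, and the distance between two curves measures how differently two features score the same subjects. A minimal sketch, assuming diversity is the mean absolute difference between RSC functions (the authors' exact metric and normalization are not reproduced here):

```python
import numpy as np

def rank_score_function(scores):
    """RSC function f = s ∘ r⁻¹: the scores re-ordered from highest to
    lowest, i.e. the score assigned to each rank position."""
    return np.sort(np.asarray(scores, dtype=float))[::-1]

def rsc_diversity(scores_a, scores_b):
    """Diversity between two scoring systems, measured here as the mean
    absolute difference between their RSC functions (an assumed metric)."""
    fa = rank_score_function(scores_a)
    fb = rank_score_function(scores_b)
    return float(np.mean(np.abs(fa - fb)))
```

Note that the RSC function ignores which subject received which score, so two features that assign the same multiset of scores in a different order have zero diversity; it is the shape of the score-vs-rank curve that is compared.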
Table 5. Comparison of performance among fusion method.
Feature Group | Set | NB (%) | 1-NN (%) | 3-NN (%) | 5-NN (%) | J48 (%) | SVM (%) | CFA (%)
(A) 5-feature sets | A(5,1) | 81.36 | 79.66 | 79.66 | 77.97 | 86.44 | 74.58 | 95.49
 | A(5,2) | 81.36 | 77.97 | 83.05 | 86.44 | 84.75 | 74.58 | 93.53
 | A(5,3) | 76.27 | 76.27 | 76.27 | 81.36 | 84.75 | 72.88 | 93.73
 | A(5,4) | 83.05 | 76.27 | 77.97 | 79.66 | 84.75 | 74.58 | 95.27
 | Average | 80.51 | 77.54 | 79.24 | 81.36 | 85.17 | 74.16 | 94.51
(B) 5-feature sets | B(5,1) | 64.41 | 69.49 | 69.49 | 69.49 | 74.58 | 67.80 | 78.36
 | B(5,2) | 81.36 | 77.97 | 74.58 | 77.97 | 77.97 | 79.67 | 77.75
 | B(5,3) | 84.75 | 86.44 | 83.05 | 77.97 | 72.88 | 74.58 | 80.34
 | B(5,4) | 62.71 | 71.19 | 77.97 | 76.27 | 74.58 | 67.80 | 77.85
 | Average | 73.31 | 76.27 | 76.27 | 75.43 | 75.00 | 72.46 | 78.58
(C) 7-feature sets | C(7,1) | 83.05 | 79.66 | 86.44 | 86.44 | 77.97 | 74.58 | 91.80
 | C(7,2) | 81.36 | 77.97 | 74.58 | 79.66 | 79.66 | 72.88 | 93.22
 | C(7,3) | 83.05 | 76.27 | 79.66 | 84.75 | 76.27 | 77.97 | 92.92
 | C(7,4) | 81.36 | 72.88 | 76.27 | 81.36 | 77.97 | 74.58 | 92.99
 | Average | 82.21 | 76.70 | 79.24 | 83.05 | 77.97 | 75.00 | 92.73
(D) 9-feature sets | D(9,1) | 84.75 | 76.27 | 83.05 | 88.14 | 72.88 | 72.88 | 89.50
 | D(9,2) | 79.66 | 77.97 | 79.66 | 83.05 | 72.88 | 74.58 | 90.28
 | D(9,3) | 79.66 | 77.97 | 79.66 | 88.14 | 77.97 | 72.88 | 91.33
 | D(9,4) | 81.36 | 81.36 | 79.66 | 74.58 | 76.27 | 71.19 | 90.49
 | Average | 81.36 | 78.39 | 80.51 | 83.48 | 75.00 | 72.88 | 90.40
(E) 11-feature sets | E(11,1) | 83.05 | 76.27 | 83.05 | 84.75 | 76.27 | 72.88 | 90.50
 | E(11,2) | 79.66 | 76.27 | 84.75 | 86.44 | 76.27 | 74.58 | 90.10
 | E(11,3) | 81.36 | 79.66 | 81.36 | 88.14 | 76.27 | 74.58 | 88.60
 | E(11,4) | 86.44 | 77.97 | 83.05 | 86.44 | 77.97 | 77.97 | 91.20
 | Average | 82.63 | 77.54 | 83.05 | 86.44 | 76.70 | 75.00 | 90.10
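The CFA column in Table 5 combines the selected features' individual scoring systems via score combination (SC) and rank combination (RC), as abbreviated above [50,51]. A minimal, hypothetical sketch of these two standard CFA combination rules (not the authors' implementation; score normalization is omitted):

```python
import numpy as np

def ranks(scores):
    """Rank items by score: the highest score receives rank 1."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    r = np.empty(len(scores))
    r[order] = np.arange(1, len(scores) + 1)
    return r

def score_combination(score_lists):
    """SC: average the scores of the individual scoring systems."""
    return np.mean(score_lists, axis=0)

def rank_combination(score_lists):
    """RC: average the ranks; a lower combined rank is a better item."""
    return np.mean([ranks(s) for s in score_lists], axis=0)
```

SC preserves the magnitude of agreement between systems, while RC is insensitive to score scale, which is why CFA evaluates both and keeps whichever combination performs better.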
Table 6. Correct Rate (%) comparison of proposed method with previous work of gender recognition.
Approach | Correct Rate (%)
Proposed method | 94.51
[8] | 90.34
[9] | 92.50

MDPI and ACS Style

Rosli, N.A.I.M.; Rahman, M.A.A.; Balakrishnan, M.; Komeda, T.; Mazlan, S.A.; Zamzuri, H. Improved Gender Recognition during Stepping Activity for Rehab Application Using the Combinatorial Fusion Approach of EMG and HRV. Appl. Sci. 2017, 7, 348. https://doi.org/10.3390/app7040348
