Article

Human Motion Recognition by Textile Sensors Based on Machine Learning Algorithms

Department of Organic Materials and Fiber Engineering, Soongsil University, Seoul 156-743, Korea
* Author to whom correspondence should be addressed.
Sensors 2018, 18(9), 3109; https://doi.org/10.3390/s18093109
Submission received: 13 July 2018 / Revised: 5 September 2018 / Accepted: 11 September 2018 / Published: 14 September 2018
(This article belongs to the Section Physical Sensors)

Abstract

Wearable sensors for human physiological monitoring have attracted tremendous interest from researchers in recent years. However, most of the research has involved simple trials without any significant analytical algorithms. This study provides a way of recognizing human motion in a realistic application by combining textile stretch sensors, based on single-walled carbon nanotubes (SWCNTs) and spandex fabric (PET/SP), with machine learning algorithms. The performance of the system is evaluated by the identification rate and accuracy for a set of standardized motions. This research aims to provide a realistic motion-sensing wearable product without unnecessarily heavy and uncomfortable electronic devices.

1. Introduction

Wearable technology, especially wearable sensors, has become mainstream these days and has attracted great interest from researchers. By revealing the multi-dimensional aspects of human life, wearable technology can be widely applied in medicine, healthcare, power sources, flexible electronic components, etc. In the healthcare field, patients can be quickly diagnosed and treated for a variety of diseases with the help of such devices [1]. In sport, athletes’ performance is monitored in order to detect abnormalities, prepare training and tactics plans, or protect them from injuries [2]. Through special structures, wearable electronics can be applied in flexible batteries [3,4], capacitive energy storage [5], data storage [1,2,3,4,5,6], or fashion [7].
Most sensor operating mechanisms are based on a relationship between a physical or chemical quantity, such as temperature, pressure, stretch, light, sound, vibration, distance, humidity, or pH, and an electrical property, such as the resistance, electromagnetism, or capacitance of the constituent conductive materials. According to this principle, a popular design approach for wearable sensors is to integrate electronic devices, including temperature gauge, stretch, proximity, accelerometer, and pulse-oximeter sensors, into a small hard packet added to clothes or jewelry [2,8,9,10,11] or placed directly on the skin [12,13,14]. For example, Son et al. [1] developed bio-integrated systems for diagnosis and therapy of movement disorders. Someya et al. [14] discussed the latest progress in the use of soft electronic materials and their related devices in biological interfaces. Lee et al. [15] studied the development of skin-mounted graphene-hybrid (GP-hybrid) device arrays capable of sweat-based glucose and pH monitoring in conjunction with a sweat-control layer. Gao et al. [16] presented a mechanically flexible and fully integrated sensor array for multiplexed in-situ perspiration analysis, which simultaneously and selectively measures sweat metabolites and electrolytes. Dobkin et al. [17] used gyroscopes, accelerometers, and other physiologic sensors to monitor distance, gait asymmetry, and smoothness of human movements. Wang et al. [18] developed flexible pressure sensors based on polydimethylsiloxane (PDMS) films for monitoring physiological signals. Many other studies on stretch sensors [19,20,21], e-skins [12,13], temperature sensors [21,22], and pressure sensors [2,13,23] used graphene [23,24] or carbon nanotubes (CNTs) [11,12,25] as sensing materials. The resulting sensors proved to be highly sensitive and well suited to wearable devices.
On the other hand, nanowires (NWs), such as gold NWs [1], Ge/Si-ZnO NWs [18], silver NWs [26,27], or compound mixtures [28], have also shown positive results. However, all the above studies have at least one of the following disadvantages: hard electronic components that are uncomfortable to wear and restrict movement, complex or high-cost fabrication methods, or a lack of suitable signal-processing algorithms for use in an actual product.
This research developed a complete wearable-sensor system combining sensor fabrication based on single-walled carbon nanotubes (SWCNTs) [11,18,28,29] and spandex fabric (PET/SP) with machine learning algorithms [30] for the analysis of sensing signals, aimed at real products for human motion monitoring [31,32,33,34,35]. The conductive polyethylene terephthalate (PET/Spandex) fabrics were prepared by padding with conductive SWCNT ink in order to construct textile fabric sensors. The fabricated textile sensors were characterized in terms of their mechanical and electrical performance as a function of stretch ratio or stretch percentage. Human motion signals obtained through the e-textile stretch sensor are processed by a specially designed circuit, which digitizes and arranges the signals into a custom format for further analysis. The data are then transmitted via Bluetooth to a mobile phone [36,37,38,39], tablet, or desktop computer in real time for display or analysis based on machine learning algorithms, in order to obtain the best classification of four predefined standardized human motions: walking, running, sprinting, and jumping.
Machine learning (ML) algorithms have been applied to a wide variety of fields, including medical diagnosis, natural language processing, online search, smart cars, and marketing personalization. In particular, within the field of data analytics, machine learning algorithms are among the most promising methods for devising complex models that lend themselves to high-accuracy prediction and classification tasks. Several useful ML algorithms for classification have been proposed, such as random forest (RD) [40], support vector machine (SVM) [41], neural network (NN) [42,43,44], and deep neural network (DNN) [45]. The performance of the developed algorithms was evaluated in terms of the mechanical properties of the sensors and the accuracy of the applied algorithms under actual, realistic wearing-test conditions. The textile sensors proved to be extremely thin, lightweight, sensitive, and thus highly flexible, causing no harm, irritation, or allergies to the skin.
Through controlling the pressure on the squeezing machine, we obtained fabric sensors with uniform final resistances and low migration of CNT powders on PET/SP fibers after stretching. Based on that, we suggest the possibility of mass production of these fabric sensors with an easy combination of sensor fabrication and machine learning algorithm models.

2. Materials and Methods

2.1. Materials

This research used a PET/SP fabric with a polyethylene terephthalate/spandex ratio of 76/24 (item 16043A, 341 g/YD, 262 g/SQM, from SNT Co. Ltd., Seoul, South Korea). The raw single-walled carbon nanotube (SWCNT) powder was obtained from KH Chemical Co. Ltd. (Seoul, South Korea). These SWCNTs were treated with a laboratory-grade acid solution. The stirring machine, ultrasonication machine, automatic dipping padding machine, and two-way drying machine were sourced from Daelim Starlet Co. Ltd. (Seoul, South Korea). All other electronic components, such as the Bluetooth module, microprocessor, and lithium battery, were used as purchased.

2.2. Methods

2.2.1. Textile Sensor Fabrication

PET/SP fabrics, known for their exceptional elasticity, were prepared by co-weaving spandex with polyester. A small amount of spandex is used in the final fabric so that it may retain most of the look and feel of PET fibers. The PET/SP fabric is very resilient and can withstand a good deal of wear and tear, is waterproof and shows less wrinkling. These attributes make this spandex fabric widely applicable in industry to produce products such as clothing, household furniture, industrial fabrics, etc. The structure of the PET/SP fabric is composed of conventional PET/SP multifilament yarns with high elasticity and recovery. These fibers could be converted into conductive fibers via coating, padding, and surface treatment.
To fabricate the fabric sensor (Figure 1), the carbon ink was a water-based single-walled carbon nanotube (SWCNT) solution with a nanotube diameter of 1.0–1.3 nm and a concentration of 0.1 wt.%. The SWCNT powder was treated with acid solution (HNO3:H2SO4 = 3:1), dispersed in H2O with sodium dodecylbenzenesulfonate (SDBS), ultra-sonicated (2 h, 19.990 Hz), and stirred (60–80 °C, 1000 rpm, 24 h). The PET/SP fabrics were prepared and immersed in the SWCNT ink within the bath of the automatic dipping padding machine. The impregnating process maintained conditions that allowed the SWCNT particles to penetrate well (pressure roll speed: 1.0 m/min, air cylinder pressure: 3 bar (0.3 MPa) overpressure), so that the SWCNT particles adhered to the fabric surface after dipping and squeezing. A two-way drying machine was then used to remove the excess water from the fabrics. The drying conditions were optimized at a drying time of 1–3 min, a temperature range of 180–200 °C, and a circulation fan speed of 1500 rpm. Finally, the fabric was kept for 3–5 h at normal room temperature. The fabric sensors were then cut into smaller specimens for further experiments.

2.2.2. Human Motion Analysis

Actual muscle pants equipped with the fabricated textile sensor were prepared for wearing tests including motion analysis. During the test, three participants (Table 1) were asked to wear the smart muscle pants while moving. Four types of predefined motions are shown in Figure 2. The processing circuit digitized the motion signals and transmitted them via Bluetooth to a mobile phone in real time.
The method used to monitor activities is based on the relationship between the mechanical and electrical properties of the constituent conductive fabrics. Using a voltage divider circuit, the resistance variation is converted into a voltage variation. The voltage is then sampled and digitized. For resolution reasons, voltage values between 0 and 3.7 V are mapped onto digital values between 0 and 1023 (3.7/1023 ≈ 0.0036 V, or about 3.6 mV per unit). Reading one analog input takes about 0.01 s (10 ms), so the maximum reading speed is about 100 samples per second. The motion signals are analyzed to generate three input parameters, the average amplitude (AMP), the standard deviation of the amplitude (STD), and the average cycle (CYC), for further processing.
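The voltage-to-digital mapping described above can be sketched as follows. The function names, and the assumption that the textile sensor forms the upper leg of the divider with a fixed resistor as the lower leg, are illustrative; the paper does not specify the divider topology.

```python
V_REF = 3.7          # supply voltage of the divider (from the text)
ADC_MAX = 1023       # 10-bit ADC full-scale count

def adc_to_voltage(count: int) -> float:
    """Map a raw ADC count (0..1023) linearly onto 0..3.7 V."""
    return count * V_REF / ADC_MAX

def divider_resistance(v_out: float, r_fixed: float) -> float:
    """Recover the sensor resistance from the divider output.

    Assumes v_out = V_REF * r_fixed / (r_sensor + r_fixed), i.e. the
    sensor is the upper leg and r_fixed the lower leg (an assumption,
    not stated in the paper).
    """
    return r_fixed * (V_REF - v_out) / v_out

print(adc_to_voltage(1023))              # full scale -> 3.7 V
print(divider_resistance(1.85, 10_000))  # equal legs at half supply
```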
  • Average amplitude: The average amplitude (AMP) is a commonly used measure of the magnitude of a periodic signal, determined as the ratio between the sum of the magnitudes of all instantaneous values and the number of instantaneous values considered. Considering a real signal as shown in Figure 3, A1, A2, A3, etc. are the magnitudes of the signal at instants 1, 2, 3, etc., respectively. The AMP is calculated as follows:
      AMP = (A1 + A2 + A3 + … + An)/n    (1)
  • Standard deviation of the amplitude: Standard deviation (STD) is a measure of the dispersion of data from its mean. It is calculated as the square root of the variance, determined from the deviation of each data point from the mean. A low STD indicates that the data points tend to be close to the mean of the data set, while a high STD indicates that the data points are spread over a wider range of values. Alongside the average amplitude, the STD characterizes another aspect of the signal:
      STD = √( Σᵢ₌₁ⁿ (Aᵢ − Ā)² / (n − 1) )    (2)
    where Ai represents an individual value, Ā represents the mean value, and n represents the total number of values.
  • Average cycle: This is the most important parameter for the motion classification method proposed in this research. In general, a cycle is the shortest period in which an action is repeated. The average cycle (CYC) includes the process time, during which a unit is acted upon to bring it closer to an output, and the delay time, during which a unit of work waits for the next motion. The CYC can be calculated through a threshold as shown in Figure 3.
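The three features above can be extracted from one motion window as sketched below. The threshold-crossing estimate of CYC is an illustrative interpretation: the paper defines CYC via a threshold (Figure 3) but does not give the exact procedure.

```python
import statistics

def extract_features(samples, threshold):
    """Compute (AMP, STD, CYC) for one window of motion samples.

    AMP follows Eq. (1), STD follows Eq. (2) (n - 1 denominator), and
    CYC is the mean spacing between rising threshold crossings.
    """
    amp = sum(abs(a) for a in samples) / len(samples)   # Eq. (1)
    std = statistics.stdev(samples)                     # Eq. (2)
    # Rising edges of the thresholded signal mark cycle starts.
    rises = [i for i in range(1, len(samples))
             if samples[i - 1] < threshold <= samples[i]]
    cyc = (statistics.mean(b - a for a, b in zip(rises, rises[1:]))
           if len(rises) > 1 else 0.0)
    return amp, std, cyc

# A toy square-wave-like signal with a period of 2 samples:
print(extract_features([0, 1, 0, 1, 0, 1, 0], threshold=0.5))
```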

2.2.3. Machine Learning Models

The human motion dataset consists of 400 motion samples annotated with four classes: walking, jumping, running, and sprinting. Using the ‘cvpartition’ [46] function in MATLAB, we split the dataset by assigning 75% to the training set and 25% to the testing set [31,47,48]. This research considered several machine learning models: random forest (RD), support vector machine (SVM), one-hidden-layer neural network (ANN), multi-hidden-layer neural network (MANN), and autoencoder neural network (AE). The main structure of these models is shown in Figure 4. Because the target of the study is a system that is easy to construct and quick to apply in a realistic product, we implemented the models in MATLAB 2017b: RD [49], SVM [50], ANN [51], MANN [51], and AE [52]. RD constructs a multitude of decision trees during training; the final prediction is then obtained by majority vote over the outcomes of the individual trees [53]. Figure 5 shows one tree of the RD model; triangle nodes are the splitting nodes and the bold dots are the decisions of this tree. SVM looks for the optimal separating hyperplane between the classes by maximizing the margin between the classes’ closest points [53]. The parameters of the implemented multiclass SVM model are shown in Table 2; the implemented model used SVM binary learners and a one-versus-one coding design [50]. Artificial neural networks (ANNs) are computing systems inspired by the biological neural networks that constitute the human brain. They are composed of multiple nodes connected by coefficients (weights), which constitute the different neural structures (one hidden layer (ANN), multiple hidden layers (MANN), etc.) in order to perform specific tasks [53]. AE is a special neural network structure based on efficient coding: the encoder maps the input to a hidden representation, and the decoder attempts to map this representation back to the original input [53].
As shown in Figure 6, the fabricated neural networks have different numbers of hidden layers, but each hidden layer has 20 neurons. In particular, the AE model has a softmax layer in order to get four predictions in the output.
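A minimal sketch of this model setup outside MATLAB is given below, using scikit-learn in place of ‘cvpartition’ and the MATLAB toolboxes. The synthetic (AMP, STD, CYC) features, the tree count, and the MLP iteration limit are illustrative assumptions; only the 400-sample size, 75/25 split, one-vs-one SVM, and 20-neuron hidden layers come from the text.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))        # 400 samples x (AMP, STD, CYC), synthetic
y = rng.integers(0, 4, size=400)     # four motion classes (0..3 here)

# 75% training / 25% testing, mirroring the cvpartition split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0)

models = {
    "RD": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(decision_function_shape="ovo"),   # one-versus-one, as in the paper
    "MANN": MLPClassifier(hidden_layer_sizes=(20, 20),
                          max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, round(model.score(X_te, y_te), 2))
```

On random labels these accuracies hover near chance (≈0.25); with real motion features they would approach the rates reported in Section 3.7.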

3. Results and Discussion

3.1. Structure of the Stretch Textile Sensor

Scanning electron microscopy (SEM) was employed to characterize the morphological changes of the PET/SP fabric stretch sensors at different steps of the synthesis of the conductive fabric. Figure 7 shows SEM images of the standard PET/SP fabric, with the magnified view showing no coating on the fibers, and of the PET/SP fabric coated with SWCNTs. The figure shows the surface morphology of the PET/SP fabric at high and low magnification, in its initial state and in a tensioned state (30%), respectively. The diameter of the filaments is about 10 µm, and they appear loosely twisted with ample free space between the microfiber bundles. The particles form a thin coating, stuck randomly onto the PET/SP fibers with an 80% coating rate.
The method for recognizing specific motions is strongly based on the relationship between the mechanical and electrical properties of the sensor fabrics. The resistance changes with stretching or releasing through a responsive crack-propagation mechanism. The cracks originate and propagate in the thin conductive layers coated on the PET/SP fibers during continuous mechanical stretching. They open under the accommodated stress at the stress-concentrated areas and recover to their initial states once the stretch force imposed on the fabrics is released. The edges of the cracks reconnect at this point, ensuring complete recovery of the electrical resistance. This mechanism makes the stretch sensor extremely sensitive and flexible.

3.2. Stretchability (Yield Point) and Sensitivity (Gauge Factor)

The stretchability of a stretch sensor depends on the construction material, the micro/nanostructures, and the fabrication process used. Figure 8a shows the resistance–stretch relationship of three sensor samples. The structure of the PET/SP fabric is one of the main reasons for the high stretchability (yield point, εy ≈ 50%) of the resistive-type sensor reported in this research. If the stretch is applied beyond a certain amount (ε > 50%), the PET/SP fabric yields and the fabric loses its sensing capability. This stretchability ensures a stretch-sensing range wide enough for realistic applications. As shown in Figure 8b, the sensitivity or gauge factor (GF) of the three stretch sensors is defined as the ratio of the relative change in resistance (ΔR/R) to the stretch (ε), written as GF = (ΔR/R)/ε with ε = ΔL/L. The resistance clearly increases as the stretch increases, and vice versa. For the fabricated stretch sensor, the value of GF depends mainly on the SWCNT nanostructure. The results show that the GF ranges from 4.1 to 8.5, depending on the stretch ratio (%). Based on the calculated GF values, the stretch sensor is sensitive and suitable for the applications in this research.
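The gauge-factor definition above can be applied directly. The resistance values in the example are hypothetical, chosen only so the result falls inside the reported 4.1–8.5 range; they are not measured data from the paper.

```python
def gauge_factor(r0: float, r: float, strain: float) -> float:
    """GF = (ΔR/R0) / ε, as defined in the text.

    r0: unstretched resistance; r: resistance at strain `strain`;
    strain = ΔL/L (dimensionless, e.g. 0.30 for 30% stretch).
    """
    return (r - r0) / r0 / strain

# Hypothetical sensor going from 10 kΩ to 22.75 kΩ at 30% stretch:
print(gauge_factor(10_000, 22_750, 0.30))  # GF = 4.25
```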

3.3. Current-Voltage (I-V) Curves

The I-V curve is one of the important characteristics of a stretch sensor. Figure 8d shows a set of curves that define the operation of the sensor under different static stretches from 0% to 37.5%. Over the applied voltage range from −2 V to 2 V, the resistance of the stretch sensor was constant, indicating Ohmic behavior. The slope of the I-V curves decreases as the applied stretch increases from 0% to 12.5%, 25%, and 37.5%, indicating that an increase in applied stretch leads to an increase in the sensor’s resistance.
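For an Ohmic sensor, the resistance at each static stretch can be read off the slope of its I-V curve. The least-squares fit below illustrates this; the data points are synthetic (a perfectly Ohmic 5 kΩ example), not measurements from the paper.

```python
def fit_resistance(voltages, currents):
    """Least-squares slope through the origin for I = (1/R) * V,
    returning R = 1/slope. Valid only for Ohmic (linear) I-V data."""
    num = sum(v * i for v, i in zip(voltages, currents))
    den = sum(v * v for v in voltages)
    return den / num  # R = 1 / slope

V = [-2.0, -1.0, 0.5, 1.0, 2.0]      # sweep within the -2 V..2 V range
I = [v / 5000.0 for v in V]          # synthetic currents for a 5 kΩ sensor
print(fit_resistance(V, I))          # ≈ 5000 Ω
```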

3.4. Hysteresis

Hysteresis is a behavior in which the output depends not only on the current input but also on the history of the input. Hysteresis becomes important when the stretch sensor is used in dynamic applications such as human motion monitoring, ECG monitoring, healthcare, etc. Hysteresis behaviors are mainly caused by the elastic properties of the PET/SP fabric, the interaction between the SWCNTs and the PET/SP fibers, and the reconnectability of the thin coatings after release of the applied stretch. Strong interfacial binding between the SWCNT nanostructures and the PET/SP fibers yields good stretch-sensing performance. The hysteresis behaviors at three frequencies are shown in Figure 8c, indicating a linear rise in resistance when applying stretch and a small hysteresis.

3.5. Response and Recovery Time

Response time is the time taken to initially react to a given input. The response delay in the sensors is mainly caused by the viscoelastic nature of the PET/SP fabric. The experimental results showed a response time of 200 ms at ε = 30%. Recovery time is another important parameter of the stretch sensor in order to evaluate the performance in dynamic applications. The recovery time of this fabricated stretch sensor is 220 ms at ε = 30%. The recovery time is affected by the friction force and the reconnectability between the SWCNT coatings and the PET/SP fibers. The fast self-recovery process of the SWCNTs ensures rapid recovery of the electrical property of the stretch sensor and avoids the degradation of the device performance during large deformations.

3.6. Durability

Dynamic durability refers to the stable electrical functionality and mechanical integrity of the stretch sensor during stretching/releasing cycles. This parameter depends on the fatigue and plastic deformation of the PET/SP fibers under high stress, which damages the fibers (PET) and the sensing nanomaterials (SWCNTs). The durability was tested in the laboratory using a customized UTM for the tension tests. The fabric surface remained intact after 30,000 stretching/releasing cycles, meaning that repeated stretching below 30% strain does not affect the sensor performance within 30,000 cycles. Through controlling the pressure on the squeezing machine, the uniform resistance of the samples is shown in Figure 9b–d. All samples show resistance changes of less than 10% after 30,000 cycles of 30% tension. In particular, Figure 9c shows low migration of CNT powders on the PET/SP fibers during the stretch/release cycles. These results clearly reveal that the dynamic durability is sufficient for practical applications.

3.7. Human Motion Classification

The application capability of the fabricated stretch sensor was evaluated by testing on four human motions: walking, jumping, running, and sprinting. The experiments took place in a straight corridor with a total length of 50 m, along which the participants were asked to move while wearing the smart muscle pants. The characteristics of the motions, including velocity (m/s), step size (m), and frequency (Hz), are shown in Table 3.
The results of the experiments are presented as a comparison between the output of the system and the actual motion. Two statistical indices, percent accuracy and the confusion matrix, are used to evaluate the computational efficiency of this research. The accuracy is determined by Equation (3):
  A = (TP + TN)/(TP + TN + FP + FN)    (3)
where TP, TN, FP, and FN represent the number of true positives, true negatives, false positives, and false negatives, respectively. In Figure 10, the five algorithms obtained a mean performance accuracy of 90% with the random forest, 84% with the support vector machine, 85% with one hidden layer neural network, 88% with multi-hidden layers neural network, and 87% with autoencoder. The accuracy of the random forest algorithms (90%) shows that there is a good agreement between the measured and classified values.
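Equation (3) translates directly into code. The counts in the example are hypothetical, chosen only so the result matches the ≈90% level reported for the random forest; they are not the paper's actual tallies.

```python
def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Eq. (3): A = (TP + TN) / (TP + TN + FP + FN)."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical one-vs-rest counts for a single motion class:
print(accuracy(tp=90, tn=0, fp=5, fn=5))  # -> 0.9
```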
The confusion matrix, or error matrix, is a specific tool in the field of statistical classification in machine learning. It is a specific table layout that allows visualization of the performance of algorithms. Each row of the matrix represents the instances of a classified value, while each column represents the instances of an actual value. The confusion matrices of the algorithms are shown in Figure 11, Figure 12, Figure 13, Figure 14 and Figure 15. Here, number 1 represents walking, number 2 jumping, number 3 running, and number 4 sprinting. The diagonal elements represent the number of cases for which the classified motion equals the actual motion, while the off-diagonal elements are those that are mislabeled. The higher the diagonal values of the confusion matrix, the better, as this indicates many correct classifications. Accordingly, it is clear that the walking and sprinting motions are easy to classify with all algorithms: RD (100–88.9%), SVM (100–100%), ANN (100–95.5%), M-ANN (100–92.3%), and AE (100–95.8%), respectively.
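A confusion matrix with the row/column convention described above (rows = classified value, columns = actual value) can be built as follows. The labels are toy data chosen to show a jumping/running mix-up; they are not the paper's results, and classes are zero-based here (0 = walking, 1 = jumping, 2 = running, 3 = sprinting).

```python
def confusion_matrix(actual, predicted, n_classes=4):
    """m[p][a] counts samples whose actual class is a and whose
    classified (predicted) class is p: rows = classified, columns = actual."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for a, p in zip(actual, predicted):
        m[p][a] += 1
    return m

actual    = [0, 0, 1, 1, 2, 2, 3, 3]
predicted = [0, 0, 1, 2, 2, 1, 3, 3]   # one jump->run and one run->jump error
cm = confusion_matrix(actual, predicted)
diag = sum(cm[k][k] for k in range(4))
print(cm)                    # off-diagonal cells are the jumping/running mix-ups
print(diag / len(actual))    # fraction of correct classifications -> 0.75
```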
However, the jumping and running motions are easily confused: RD (88–82.6%), SVM (77.3–64.5%), ANN (82.6–66.7%), M-ANN (83.3–76%), and AE (80–73.1%). In particular, the precision for the running motion is lowest in the support vector machine and the one-hidden-layer neural network algorithms. From the matrices, the running motion was confused with the jumping motion in the SVM model (25.8%) and in the ANN and M-ANN models (20%). The ratio of this confusion is still high in the AE model (19%); the best ratio is 13%, in the RD model. Conversely, the jumping motion was easily confused with the running motion in the SVM model (22.7%) and the AE model (20%). This ratio is almost the same in the ANN and M-ANN models (16–17%).
The research has demonstrated that the random forest, multi-hidden layers neural network and the autoencoders algorithms were superior to the support vector machine and the one hidden layer neural network in terms of classification accuracy in this realistic application. The results obtained from the random forest and the multi-hidden layers neural network algorithms were similar in terms of classification rate, and the random forest was marginally better than the multi-hidden layers neural network. Based on that result, we suggest the RD model be applied in real applications for mass production. The main reason is the high accuracy of the RD model. In addition, the algorithm of the RD model is easy to understand and it is supported in the MATLAB tools.

4. Conclusions

This research has developed a complete wearable application based on SWCNT-PET/SP sensors and machine learning models to analyze sensing signals from a real product. The research emphasizes the possibility of bringing the product from experimental concept to daily life simply, quickly, and with high economic efficiency. The fabrication process of the stretchable and flexible stretch sensor is simple, and the performance of the monitoring model was enhanced by machine learning algorithms. Based on the statistical indices, the high accuracy demonstrated that this system could be applied as an intelligent device for recognizing human motions in real time. However, the research still has some limitations: the high variation of the sensor response causes bias in the final results. We suggest using vacuum drying in the fabrication of the sensors, a process that can create a stronger connection between the SWCNTs and the PET/SP fibers. These are future directions of the project.

Author Contributions

J.K. and C.C.V. conceived and designed the experiments; J.K. and C.C.V. designed the prototype, performed the experiments and analyzed the data; C.C.V. wrote the paper.

Acknowledgments

This research was supported by the KIAT (Korea Institute for Advancement of Technology) grant funded by the Korea Government (MOTIE: Ministry of Trade, Industry and Energy) (No. P0002397, HRD program for Industrial Convergence of Wearable Smart Devices).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Son, D.; Lee, J.; Qiao, S.; Ghaffari, R.; Kim, J.; Lee, J.E.; Song, C.; Kim, S.J.; Lee, D.J.; Jun, S.W.; et al. Multifunctional wearable devices for diagnosis and therapy of movement disorders. Nat. Nanotechnol. 2014, 9, 397–404. [Google Scholar] [CrossRef] [PubMed]
  2. Majumder, S.; Mondal, T.; Deen, M.J. Wearable Sensors for Remote Health Monitoring. Sensors 2017, 17, 130. [Google Scholar] [CrossRef] [PubMed]
  3. Zamarayeva, M.A.; Ostfeld, A.E.; Wang, M.; Duey, J.K.; Deckman, I.; Lechêne, B.P.; Davies, G.; Steingart, D.A.; Arias, A.C. Flexible and stretchable power sources for wearable electronics. Sci. Adv. 2017, 3, 1602051. [Google Scholar] [CrossRef] [PubMed]
  4. Wen, Z.; Yeh, M.-H.; Guo, H.; Wang, J.; Zi, Y.; Xu, W.; Deng, J.; Zhu, L.; Wang, X.; Hu, C.; et al. Self-powered textile for wearable electronics by hybridizing fiber-shaped nanogenerators, solar cells, and supercapacitors. Sci. Adv. 2016, 2, 1600097. [Google Scholar] [CrossRef] [PubMed]
  5. Yu, D.; Goh, K.; Wang, H.; Wei, L.; Jiang, W.; Zhang, Q.; Dai, L.; Chen, Y. Scalable synthesis of hierarchically structured carbon nanotube—graphene fibres for capacitive energy storage. Nat. Nanotechnol. 2014, 9, 555–562. [Google Scholar] [CrossRef] [PubMed]
  6. Geier, M.L.; McMorrow, J.J.; Xu, W.; Zhu, J.; Kim, C.H.; Marks, T.J.; Hersam, M.C. Solution-processed carbon nanotube thin-film complementary static random access memory. Nat. Nanotechnol. 2015, 10, 944–948. [Google Scholar] [CrossRef] [PubMed]
  7. Choi, S.; Kwon, S.; Kim, H.; Kim, W.; Kwon, J.H.; Lim, M.S.; Lee, H.S.; Choi, K.C. Highly Flexible and Efficient Fabric-Based Organic Light-Emitting Devices for Clothing-Shaped Wearable Displays. Sci. Rep. 2017, 7, 6424. [Google Scholar] [CrossRef] [PubMed]
  8. Servati, A.; Zou, L.; Wang, J.Z.; Ko, F.; Servati, P. Novel Flexible Wearable Sensor Materials and Signal Processing for Vital Sign and Human Activity Monitoring. Sensors 2017, 17, 1622. [Google Scholar] [CrossRef] [PubMed]
  9. Wang, C.; Li, X.; Gao, E.; Jian, M.; Xia, K.; Wang, Q.; Xu, Z.; Ren, T.; Zhang, Y. Carbonized Silk Fabric for Ultrastretchable, Highly Sensitive, and Wearable Strain Sensors. Adv. Mater. 2016, 28, 6640–6648. [Google Scholar] [CrossRef] [PubMed]
  10. Wang, Y.; Wang, L.; Yang, T.; Li, X.; Zang, X.; Zhu, M.; Wang, K.; Wu, D.; Zhu, H. Wearable and Highly Sensitive Graphene Strain Sensors for Human Motion Monitoring. Adv. Funct. Mater. 2014, 24, 4666–4670. [Google Scholar] [CrossRef]
  11. Stoppa, M.; Chiolerio, A. Wearable Electronics and Smart Textiles: A Critical Review. Sensors 2014, 14, 11957–11992. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Hammock, L.M.; Chortos, A.; Tee, C.K.B.; Tok, B.H.J.; Bao, Z. 25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress. Adv. Mater. 2013, 25, 5997–6038. [Google Scholar] [CrossRef] [PubMed]
  13. Ho, D.H.; Sun, Q.; Kim, S.Y.; Han, J.T.; Kim, D.H.; Cho, J.H. Stretchable and Multimodal All Graphene Electronic Skin. Adv. Mater. 2016, 28, 2601–2608. [Google Scholar] [CrossRef] [PubMed]
  14. Someya, T.; Bao, Z.; Malliaras, G.G. The rise of plastic bioelectronics. Nature 2016, 540, 379–385. [Google Scholar] [CrossRef] [PubMed]
  15. Lee, H.; Choi, T.K.; Lee, Y.B.; Cho, H.R.; Ghaffari, R.; Wang, L.; Choi, H.J.; Chung, T.D.; Lu, N.; Hyeon, T.; Choi, S.H.; Kim, D.-H. A graphene-based electrochemical device with thermoresponsive microneedles for diabetes monitoring and therapy. Nat. Nanotechnol. 2016, 11, 566–572. [Google Scholar] [CrossRef] [PubMed]
  16. Gao, W.; Emaminejad, S.; Nyein, H.Y.Y.; Challa, S.; Chen, K.; Peck, A.; Fahad, H.M.; Ota, H.; Shiraki, H.; Kiriya, D.; et al. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis. Nature 2016, 529, 509–514. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Dobkin, H.B. Wearable motion sensors to continuously measure real-world physical activities. Curr. Opin. Neurol. 2013, 26, 602–608. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Wang, X.; Gu, Y.; Xiong, Z.; Cui, Z.; Zhang, T. Silk-Molded Flexible, Ultrasensitive, and Highly Stable Electronic Skin for Monitoring Human Physiological Signals. Adv. Mater. 2014, 26, 1336–1342. [Google Scholar] [CrossRef] [PubMed]
  19. Sarwar, M.S.; Dobashi, Y.; Preston, C.; Wyss, J.K.M.; Mirabbasi, S.; Madden, J.D.W. Bend, stretch, and touch: Locating a finger on an actively deformed transparent sensor array. Sci. Adv. 2017, 3, 1602200. [Google Scholar] [CrossRef] [PubMed]
  20. Pang, C.; Lee, G.-Y.; Kim, T.; Kim, S.M.; Kim, H.N.; Ahn, S.-H.; Suh, K.-Y. A flexible and highly sensitive strain-gauge sensor using reversible interlocking of nanofibers. Nat. Mater. 2012, 11, 795–801. [Google Scholar] [CrossRef] [PubMed]
  21. Amjadi, M.; Kyung, K.; Park, I.; Sitti, M. Stretchable, Skin-Mountable, and Wearable Strain Sensors and Their Potential Applications: A Review. Adv. Funct. Mater. 2016, 26, 1678–1698. [Google Scholar] [CrossRef]
  22. Hsu, P.-C.; Liu, C.; Song, A.Y.; Zhang, Z.; Peng, Y.; Xie, J.; Liu, K.; Wu, C.-L.; Catrysse, P.B.; Cai, L.; et al. A dual-mode textile for human body radiative heating and cooling. Sci. Adv. 2017, 3, 1700895. [Google Scholar] [CrossRef] [PubMed]
  23. Mao, C.; Zhang, H.; Lu, Z. Flexible and wearable electronic silk fabrics for human physiological monitoring. Smart Mater. Struct. 2017, 26, 095033. [Google Scholar] [CrossRef]
  24. Ren, J.; Wang, C.; Zhang, X.; Carey, T.; Chen, K.; Yin, Y.; Torrisi, F. Environmentally-friendly conductive cotton fabric as flexible strain sensor based on hot press reduced graphene oxide. Carbon 2017, 111, 622–630. [Google Scholar] [CrossRef]
  25. Wang, L.; Loh, J.K. Wearable carbon nanotube-based fabric sensors for monitoring human physiological performance. Smart Mater. Struct. 2017, 26, 055018. [Google Scholar] [CrossRef] [Green Version]
  26. Kim, K.K.; Hong, S.; Cho, H.M.; Lee, J.; Suh, Y.D.; Ham, J.; Ko, S.H. Highly Sensitive and Stretchable Multidimensional Strain Sensor with Prestrained Anisotropic Metal Nanowire Percolation Networks. Nano Lett. 2015, 15, 5240–5247. [Google Scholar] [CrossRef] [PubMed]
  27. Amjadi, M.; Pichitpajongkit, A.; Lee, S.; Ryu, S.; Park, I. Highly Stretchable and Sensitive Strain Sensor Based on Silver Nanowire–Elastomer Nanocomposite. ACS Nano 2014, 8, 5154–5163. [Google Scholar] [CrossRef] [PubMed]
  28. Guo, X.; Huang, Y.; Zhao, Y.; Mao, L.; Gao, L.; Pan, W.; Zhang, Y.; Liu, P. Highly stretchable strain sensor based on SWCNTs/CB synergistic conductive network for wearable human-activity monitoring and recognition. Smart Mater. Struct. 2017, 26, 095017. [Google Scholar] [CrossRef]
  29. Roman, C.; Helbling, T.; Hierold, C. Single-Walled Carbon Nanotube Sensor Concepts. In Springer Handbook of Nanotechnology, 3rd ed.; Bhushan, B., Ed.; Springer: Berlin, Germany, 2010; pp. 403–425. [Google Scholar]
  30. Bengio, Y.; Courville, A.; Vincent, P. Representation Learning: A Review and New Perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1798–1828. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  31. Lara, O.D.; Labrador, M.A. A Survey on Human Activity Recognition using Wearable Sensors. IEEE Commun. Surv. Tutor. 2012, 15, 1192–1209. [Google Scholar] [CrossRef]
  32. Lopez, N.H.I.; Munoz, M.A. Wearable Inertial Sensors for Human Motion Analysis: A Review. IEEE Sens. J. 2016, 16, 7821–7834. [Google Scholar] [CrossRef]
  33. Guo, F.M.; Cui, X.; Wang, K.L.; Wei, J.Q. Stretchable and compressible strain sensors based on carbon nanotube meshes. Nanoscale 2016, 8, 19352–19358. [Google Scholar] [CrossRef] [PubMed]
  34. Jiao, Y.; Young, C.W.; Yang, S.; Oren, S.; Ceylan, H.; Kim, S.; Gopalakrishnan, K.; Taylor, P.C.; Dong, L. Wearable Graphene Sensors with Microfluidic Liquid Metal Wiring for Structural Health Monitoring and Human Body Motion Sensing. IEEE Sens. J. 2016, 16, 7870–7875. [Google Scholar] [CrossRef]
  35. Chetty, G.; White, M. Body sensor networks for human activity recognition. In Proceedings of the 3rd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 11–12 February 2016; pp. 660–665. [Google Scholar]
  36. He, Z.; Bai, X. A wearable wireless body area network for human activity recognition. In Proceedings of the 2014 Sixth International Conference on Ubiquitous and Future Networks (ICUFN), Shanghai, China, 8–11 July 2014; pp. 115–119. [Google Scholar]
  37. Sairam, K.V.S.S.S.S.; Gunasekaran, N.; Redd, S.R. Bluetooth in wireless communication. IEEE. Commun. Mag. 2002, 40, 90–96. [Google Scholar] [CrossRef]
  38. Schilingovski, P.; Vulfin, V.; Sayfan, A.S.; Shavit, R. Wearable antennas design for wireless communication. In Proceedings of the 2017 IEEE International Conference on Microwaves, Antennas, Communications and Electronic Systems (COMCAS), Tel-Aviv, Israel, 13–15 November 2017; pp. 1–3. [Google Scholar]
  39. Akyildiz, I.F.; Su, W.; Sankarasubramaniam, Y.; Cayirci, E. A survey on sensor networks. IEEE Commun. Mag. 2002, 40, 102–114. [Google Scholar] [CrossRef]
  40. Biau, G. Analysis of a random forests model. J. Mach. Learn. Res. 2012, 13, 1063–1095. [Google Scholar]
  41. Brereton, R.G.; Lloyd, G.R. Support Vector Machines for classification and regression. Analyst 2010, 135, 230–267. [Google Scholar] [CrossRef] [PubMed]
  42. Dias, M.F.; Antunes, A.; Mota, M.A. Artificial neural networks: A review of commercial hardware. Eng. Appl. Artif. Intell. 2004, 17, 945–952. [Google Scholar] [CrossRef]
  43. Granitto, P.M.; Verdes, P.F.; Ceccatto, H.A. Neural network ensembles: Evaluation of aggregation algorithms. Artif. Intell. 2005, 163, 139–162. [Google Scholar] [CrossRef]
  44. Ojha, K.V.; Abraham, A.; Snasel, V. Metaheuristic design of feedforward neural networks: A review of two decades of research. Eng. Appl. Artif. Intell. 2017, 60, 97–116. [Google Scholar] [CrossRef] [Green Version]
  45. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. cvpartition. Available online: https://www.mathworks.com/help/stats/cvpartition.html (accessed on 2 September 2018).
  47. Karpathy, A.; Toderici, G.; Shetty, S.; Leung, T.; Sukthankar, R.; Fei-Fei, L. Large-scale video classification with convolutional neural networks. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1725–1732. [Google Scholar]
  48. Sebastiani, F. Machine learning in automated text categorization. ACM Comput. Surv. 2002, 34, 1–47. [Google Scholar] [CrossRef] [Green Version]
  49. TreeBagger. Available online: https://www.mathworks.com/help/stats/treebagger.html (accessed on 31 July 2018).
  50. fitcecoc. Available online: https://www.mathworks.com/help/stats/fitcecoc.html (accessed on 31 July 2018).
  51. Neural Network Toolbox. Available online: https://www.mathworks.com/help/nnet/index.html (accessed on 31 July 2018).
  52. Autoencoder class. Available online: https://www.mathworks.com/help/nnet/ref/autoencoder-class.html (accessed on 31 July 2018).
  53. Shalev-Shwartz, S.; Ben-David, S. Understanding Machine Learning: From Theory to Algorithms, 1st ed.; Cambridge University Press: New York, NY, USA, 2014; pp. 202–283. [Google Scholar]
Figure 1. Summary of the fabrication process and application of SWCNT stretch sensors.
Figure 2. Types of human motion signals.
Figure 3. The parameters of the signal.
Figure 4. Structure of the models.
Figure 5. The graphical display of one tree in the RD model.
Figure 6. Structure of the neural network models: (a) One hidden layer, (b) Multi-hidden layers, and (c) Autoencoder.
Figure 7. Surfaces of the fabricated sensors: (a) untreated, (b) treated, and (c) treated and stretched (lower magnification); (d) untreated, (e) treated, and (f) treated and stretched (higher magnification).
Figure 8. Characteristics of the sensors (I): (a) Stretch-ability, (b) Gauge factor, (c) Hysteresis, and (d) Current-Voltage curves.
Figure 9. Characteristics of the sensors (II): (a) Response-recovery time; (b) Resistance of sensors after 30,000 cycles of dynamic tension test (30%); (c) Stretch-resistance over 700 cycles of dynamic tension test (30%); and (d) Resistance of the sensors in five different samples.
Figure 10. Comparison of the correct classification motions of the models (RD, SVM, ANN, M-ANN, AE) with actual motions.
Figure 11. The accuracy of the random forest algorithm.
Figure 12. The accuracy of the support vector machine algorithm.
Figure 13. The accuracy of the one-hidden layer neural network algorithm.
Figure 14. The accuracy of the multi-hidden layers neural network algorithm.
Figure 15. The accuracy of the autoencoder algorithm.
Table 1. Information about participants.

Age (year)   Gender   Weight (kg)   Height (m)
28           Male     55            1.67
26           Male     62            1.70
32           Male     65            1.72
Table 2. Parameters of the multiclass model for support vector machines.

Name                     Characteristic
Response Name            ‘Y’ (Output)
Categorical Predictors   [none]
Class Names              [‘Walking’ ‘Jumping’ ‘Running’ ‘Sprinting’]
Score Transform          ‘none’
Binary Learners          {6 × 1 cell}
Coding Name              ‘onevsone’
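The ‘onevsone’ coding in Table 2 can be illustrated with a short sketch. The paper builds its model with MATLAB's fitcecoc [50]; the scikit-learn analogue below is only a stand-in, and the feature vectors and labels are placeholders, not the paper's sensor data:

```python
# Illustrative sketch of Table 2's one-vs-one coding using scikit-learn
# (the paper uses MATLAB's fitcecoc; features and labels here are stand-ins).
import numpy as np
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC

classes = ["Walking", "Jumping", "Running", "Sprinting"]
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))   # 40 placeholder signal-feature vectors
y = np.array(classes * 10)     # 10 placeholder labels per motion class

model = OneVsOneClassifier(SVC(kernel="linear")).fit(X, y)

# With 4 classes, one-vs-one coding trains C(4, 2) = 6 binary learners,
# which matches the "{6 x 1 cell}" of binary learners listed in Table 2.
print(len(model.estimators_))  # 6
```

One-vs-one coding trains one binary SVM per pair of classes, which is why four motion classes yield exactly six binary learners.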
Table 3. Characteristics of motions.

Motion      Velocity (m/s)   Step Size (m)   Frequency (Hz)
Walking     1.2              0.35            1.7
Running     3.2              0.45            2.4
Sprinting   5.0              0.7             3.0
Jumping     1.5              0.75            2.0
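As a toy illustration of why these four motions are separable, a small random forest (cf. the paper's RD model) fits jittered copies of the Table 3 feature vectors cleanly. The sample counts and noise level below are assumptions for the sketch, not the paper's experimental protocol:

```python
# Toy sketch: a small random forest separates the four motions of Table 3
# using jittered copies of their (velocity, step size, frequency) rows.
# Sample counts and noise level are assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

motions = {
    "Walking":   [1.2, 0.35, 1.7],
    "Running":   [3.2, 0.45, 2.4],
    "Sprinting": [5.0, 0.70, 3.0],
    "Jumping":   [1.5, 0.75, 2.0],
}

rng = np.random.default_rng(42)
X, y = [], []
for label, base in motions.items():
    for _ in range(30):  # 30 noisy samples per motion class
        X.append(np.asarray(base) + rng.normal(scale=0.05, size=3))
        y.append(label)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[1.25, 0.36, 1.68]])[0])  # a point near Walking's row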

Share and Cite

MDPI and ACS Style

Vu, C.C.; Kim, J. Human Motion Recognition by Textile Sensors Based on Machine Learning Algorithms. Sensors 2018, 18, 3109. https://doi.org/10.3390/s18093109

AMA Style

Vu CC, Kim J. Human Motion Recognition by Textile Sensors Based on Machine Learning Algorithms. Sensors. 2018; 18(9):3109. https://doi.org/10.3390/s18093109

Chicago/Turabian Style

Vu, Chi Cuong, and Jooyong Kim. 2018. "Human Motion Recognition by Textile Sensors Based on Machine Learning Algorithms" Sensors 18, no. 9: 3109. https://doi.org/10.3390/s18093109

APA Style

Vu, C. C., & Kim, J. (2018). Human Motion Recognition by Textile Sensors Based on Machine Learning Algorithms. Sensors, 18(9), 3109. https://doi.org/10.3390/s18093109

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers. See further details here.

Article Metrics

Back to TopTop