Portable Facial Expression System Based on EMG Sensors and Machine Learning Models
Abstract
1. Introduction
- An extensive literature review is carried out to select the minimum number of sensors and samples needed to classify six human emotions, showing that EMG analysis is a suitable alternative in harsh environments where cameras struggle to capture high-quality images of people.
- An ML analysis is performed using three approaches to determine which one delivers the best classification with a light workload on the electronic device. To this end, the analog signals are converted into different data structures that fit each algorithm’s training phase (a minimal sketch of this conversion follows the list below).
- An electronic system design is presented that combines hardware and software into an on-device ML application, keeping a high classification score while consuming less power than camera-based systems.
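A minimal sketch of the two data layouts referred to above, assuming three EMG channels and 100-sample windows (figures consistent with the layer tables reported later); the placeholder signal, channel count, and window length are illustrative assumptions, not the recorded dataset.

```python
import numpy as np

# Placeholder stream: 3 EMG channels sampled continuously (assumed shape, not real data).
raw = np.random.rand(10_000, 3)           # (samples, channels)

WINDOW = 100                              # assumed samples per labelled expression window

def to_sequence_windows(signal, window=WINDOW):
    """Cut the stream into (window, channels) blocks for the sequence-style network."""
    n = signal.shape[0] // window
    return signal[: n * window].reshape(n, window, signal.shape[1])

def to_flat_vectors(signal, window=WINDOW):
    """Flatten each block into one feature vector for classical ML and the dense network."""
    return to_sequence_windows(signal, window).reshape(-1, window * signal.shape[1])

X_seq = to_sequence_windows(raw)          # shape (100, 100, 3)
X_flat = to_flat_vectors(raw)             # shape (100, 300)
print(X_seq.shape, X_flat.shape)
```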
2. Background
2.1. Early Works on EMG in the Field of Emotion Recognition
2.2. Facial Muscles
3. Electronic Design
3.1. Sensors’ Location
3.2. Electronic System Description
4. Machine Learning Pipeline
4.1. Data Collection
4.2. Data Preprocessing
4.3. Data Preparation
4.4. Model Design
4.4.1. Supervised Classification Algorithms
- Distance-based: K-nearest neighbors (KNN).
- Model-based: Support vector machine (SVM).
- Density-based: Bayesian classifier (BC).
- Heuristic: Decision tree (DT).
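As a rough illustration of how these four classifier families can be trained and compared, the sketch below uses scikit-learn on placeholder feature vectors; the data shapes, hyperparameters, and scaling step are assumptions for the example and do not reproduce the paper’s exact configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Placeholder data: 300-dimensional flattened EMG windows, six emotion labels (assumed shapes).
X = np.random.rand(600, 300)
y = np.random.randint(0, 6, size=600)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

models = {
    "KNN (distance-based)": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM (model-based)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "Bayesian classifier (density-based)": GaussianNB(),
    "Decision tree (heuristic)": DecisionTreeClassifier(max_depth=10, random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {accuracy_score(y_test, model.predict(X_test)):.3f}")
```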
4.4.2. Neural Networks
5. Results
5.1. Evaluation of ML Models
5.2. Model Optimization and Deployment
5.3. Electronic Device
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Emotion | Muscular Basis | FACS Name |
|---|---|---|
| Happiness | Orbicularis oculi | Cheek raiser |
| | Zygomaticus major | Lip corner puller |
| Anger | Depressor glabellae | Brow lowerer |
| | Depressor supercilii | Upper lid raiser |
| | Corrugator supercilii | Lid tightener |
| | Orbicularis oculi | |
| | Levator palpebrae superioris | |
| Surprise | Frontalis | Inner brow raiser |
| | Levator palpebrae superioris | Outer brow raiser |
| | Masseter | Upper lid raiser |
| | Temporalis | Jaw drop |
| Fear | Frontalis | Inner brow raiser |
| | Orbicularis oculi | Outer brow raiser |
| | Corrugator supercilii | Brow lowerer |
| | Depressor supercilii | Upper lid raiser |
| | Levator superioris | Lid tightener |
| | Risorius | Lip stretcher |
| | Masseter | Jaw drop |
| Disgust | Levator labii superioris | Nose wrinkler |
| | Depressor anguli oris | Lip corner depressor |
| | Levator labii inferioris | Chin raiser |
| | Mentalis | |
| Sadness | Frontalis | Inner brow raiser |
| | Depressor anguli oris | Brow lowerer |
| | Corrugator supercilii | Lip corner depressor |
| | Depressor supercilii | |
| Feature | Description |
|---|---|
| Processor | Model: Intel Core i7-6500U (6th Gen); Speed: 2.5 GHz; Cache: 4 MB Intel® Smart Cache; Instruction set: 64-bit |
| Memory | Type: DDR3L-1600; Speed: 1600 MHz; Capacity: 32 GB |
| Storage | Type: SATA HDD; Speed: 5400 RPM; Capacity: 1 TB |
| GPU | Model: NVIDIA Quadro M500M; Memory: 2 GB |
| Operating system | 64-bit Windows 10 Professional Edition |
| Parameter | Original Signal | Media Mobile | Moving Average | Savitzky–Golay | Gaussian |
|---|---|---|---|---|---|
| Mean | 145.50 | 221.04 | 218.08 | 220.86 | 220.86 |
| SD | 78.31 | 93.62 | 99.28 | 98.19 | 89.30 |
| SNR | 1.85 | 2.36 | 2.19 | 2.24 | 2.47 |
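The filter comparison above can be outlined with standard smoothing routines; the sketch below is a minimal example on a placeholder rectified EMG envelope, where window lengths and the Gaussian sigma are illustrative assumptions and SNR is taken as mean over standard deviation, which is consistent with the ratios in the table (the paper may define it differently).

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import savgol_filter

def snr(x):
    """SNR as mean / standard deviation (matches the ratios reported in the table)."""
    return np.mean(x) / np.std(x)

# Placeholder rectified EMG envelope; amplitude statistics are arbitrary.
rng = np.random.default_rng(0)
emg = np.abs(rng.normal(150.0, 80.0, size=1000))

smoothed = {
    "Moving average": np.convolve(emg, np.ones(15) / 15, mode="same"),
    "Savitzky-Golay": savgol_filter(emg, window_length=15, polyorder=3),
    "Gaussian": gaussian_filter1d(emg, sigma=5),
}

print(f"Original: mean={emg.mean():.2f}  SD={emg.std():.2f}  SNR={snr(emg):.2f}")
for name, sig in smoothed.items():
    print(f"{name}: mean={sig.mean():.2f}  SD={sig.std():.2f}  SNR={snr(sig):.2f}")
```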
| Layer (Type) | Output Shape | Number of Parameters |
|---|---|---|
| Input (Dense) | (None, 100, 80) | 320 |
| Layer 1 (Dense) | (None, 100, 40) | 3240 |
| Layer 2 (Dense) | (None, 100, 20) | 820 |
| Layer 3 (Flatten) | (None, 2000) | 0 |
| Output (Dense) | (None, 6) | 12,006 |
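A minimal Keras sketch that reproduces the output shapes and parameter counts in the table above, assuming the input is a window of 100 time steps by 3 EMG channels; activation functions and training settings are assumptions, as the table only fixes the layer sizes.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Sequence-style model (DP1): assumed input of 100 time steps x 3 EMG channels.
dp1 = keras.Sequential([
    keras.Input(shape=(100, 3)),
    layers.Dense(80, activation="relu"),    # (None, 100, 80): 3*80 + 80   = 320 parameters
    layers.Dense(40, activation="relu"),    # (None, 100, 40): 80*40 + 40  = 3240 parameters
    layers.Dense(20, activation="relu"),    # (None, 100, 20): 40*20 + 20  = 820 parameters
    layers.Flatten(),                       # (None, 2000)
    layers.Dense(6, activation="softmax"),  # (None, 6): 2000*6 + 6        = 12,006 parameters
])
dp1.summary()
```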
| Layer (Type) | Output Shape | Number of Parameters |
|---|---|---|
| Input (Dense) | (None, 100) | 30,100 |
| Layer 1 (Dense) | (None, 50) | 5050 |
| Layer 2 (Dense) | (None, 25) | 1275 |
| Output (Dense) | (None, 6) | 156 |
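The second architecture can be sketched the same way, assuming a 300-dimensional flattened input (100 samples x 3 channels), which yields the 30,100 parameters of the first dense layer; activations are again assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Flat-vector model (DP2): assumed input of one 300-dimensional flattened window.
dp2 = keras.Sequential([
    keras.Input(shape=(300,)),
    layers.Dense(100, activation="relu"),   # 300*100 + 100 = 30,100 parameters
    layers.Dense(50, activation="relu"),    # 100*50 + 50   = 5050 parameters
    layers.Dense(25, activation="relu"),    # 50*25 + 25    = 1275 parameters
    layers.Dense(6, activation="softmax"),  # 25*6 + 6      = 156 parameters
])
dp2.summary()
```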
| ML Model | Precision | Recall | F1-Score | Error | Accuracy |
|---|---|---|---|---|---|
| SVM | 0.9154 | 0.9166 | 0.9154 | 0.10 | 0.92 |
| KNN | 0.8296 | 0.8166 | 0.8153 | 0.25 | 0.82 |
| Decision tree | 0.9384 | 0.9333 | 0.9334 | 0.13 | 0.93 |
| Naive Bayes | 0.9718 | 0.9666 | 0.9667 | 0.06 | 0.97 |
| DP1 | 1.00 | 1.00 | 1.00 | 0.00 | 1.00 |
| DP2 | 0.9436 | 0.9333 | 0.9319 | 0.06 | 0.93 |
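For reference, the metrics in the table can be computed from held-out predictions as sketched below; the label vectors are placeholders, and macro averaging is an assumption about how the per-class scores were aggregated.

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Placeholder ground truth and predictions over the six emotion classes.
y_true = [0, 1, 2, 3, 4, 5, 0, 1, 2, 3]
y_pred = [0, 1, 2, 3, 4, 5, 0, 2, 2, 3]

prec, rec, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="macro", zero_division=0)
acc = accuracy_score(y_true, y_pred)
print(f"Precision={prec:.4f}  Recall={rec:.4f}  F1={f1:.4f}  Error={1 - acc:.2f}  Accuracy={acc:.2f}")
```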
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).