Crosstalk in Facial EMG and Its Reduction Using ICA
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Apparatus
2.3. Procedure
2.4. EMG Recording
2.5. Data Analysis
3. Results
3.1. Original EMG Signal Analysis
3.2. Comparison of Original and ICA-Reconstructed EMG Signals
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
| Muscle | Statistic | Frowning | Smiling | Speaking | Chewing | Frowning + Speaking | Smiling + Speaking | Frowning + Chewing | Smiling + Chewing |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Corrugator | t | 5.42 | 0.13 | 0.24 | 0.08 | 4.36 | 0.95 | 3.74 | 0.87 |
| | p | <0.001 | 0.895 | 0.809 | 0.939 | <0.001 | 0.349 | <0.001 | 0.390 |
| | d | 1.01 | 0.03 | 0.05 | 0.01 | 0.81 | 0.18 | 0.69 | 0.16 |
| Zygomatic | t | 2.27 | 5.88 | 4.47 | 4.23 | 3.86 | 5.88 | 2.41 | 5.40 |
| | p | 0.031 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | 0.023 | <0.001 |
| | d | 0.42 | 1.09 | 0.83 | 0.79 | 0.72 | 1.09 | 0.45 | 1.00 |
| Masseter | t | 1.52 | 4.74 | 8.45 | 5.98 | 6.75 | 6.84 | 4.89 | 7.28 |
| | p | 0.140 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 |
| | d | 0.28 | 0.88 | 1.57 | 1.11 | 1.25 | 1.27 | 0.91 | 1.35 |
| Suprahyoid | t | 0.59 | 3.91 | 10.44 | 5.69 | 5.07 | 7.30 | 4.91 | 7.49 |
| | p | 0.558 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 |
| | d | 0.11 | 0.73 | 1.94 | 1.06 | 0.94 | 1.36 | 0.91 | 1.39 |
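The t, p, and d values in the table above come from paired comparisons of EMG amplitude across conditions. As a minimal sketch of how such statistics are computed, the snippet below runs a paired t-test and derives Cohen's d for paired data; the sample values are invented for illustration and do not reproduce any cell of the table.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-participant EMG amplitudes in two conditions
# (e.g., during a facial action vs. rest); n = 30 participants.
action = rng.normal(1.5, 1.0, size=30)
baseline = rng.normal(1.0, 1.0, size=30)

# Paired (dependent-samples) t-test across the same participants.
t_stat, p_val = stats.ttest_rel(action, baseline)

# Cohen's d for paired data: mean of the differences divided by
# the sample standard deviation of the differences.
diff = action - baseline
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t = {t_stat:.2f}, p = {p_val:.3f}, d = {cohens_d:.2f}")
```

Note that for paired data, t and d are linked by t = d * sqrt(n), so the table's effect sizes can be sanity-checked against its t values.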
| Muscle | Statistic | Frowning | Smiling | Speaking | Chewing | Frowning + Speaking | Smiling + Speaking | Frowning + Chewing | Smiling + Chewing |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Corrugator | t | 2.39 | 1.97 | 0.52 | 1.26 | 2.28 | 0.40 | 2.97 | 1.00 |
| | p | 0.024 | 0.059 | 0.606 | 0.218 | 0.031 | 0.691 | 0.006 | 0.325 |
| | d | 0.44 | 0.37 | 0.10 | 0.23 | 0.42 | 0.08 | 0.55 | 0.19 |
| Zygomatic | t | 2.62 | 3.10 | 3.78 | 6.15 | 4.58 | 3.27 | 6.01 | 3.57 |
| | p | 0.014 | 0.004 | <0.001 | <0.001 | <0.001 | 0.003 | <0.001 | 0.001 |
| | d | 0.49 | 0.58 | 0.70 | 1.14 | 0.85 | 0.61 | 1.12 | 0.66 |
| Masseter | t | 1.40 | 2.95 | 6.63 | 5.87 | 6.06 | 5.03 | 4.77 | 4.95 |
| | p | 0.173 | 0.006 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 |
| | d | 0.26 | 0.55 | 1.23 | 1.09 | 1.13 | 0.93 | 0.89 | 0.92 |
| Suprahyoid | t | 0.77 | 2.72 | 9.40 | 5.71 | 4.72 | 6.50 | 4.69 | 6.58 |
| | p | 0.450 | 0.011 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 | <0.001 |
| | d | 0.14 | 0.51 | 1.75 | 1.06 | 0.88 | 1.21 | 0.87 | 1.22 |
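The ICA-reconstructed results above rest on the idea that a multichannel EMG recording is an instantaneous mixture of independent muscle sources, so crosstalk can be reduced by decomposing the channels into components, discarding the component attributed to the non-target muscle, and projecting back to channel space. The sketch below illustrates this on synthetic data using scikit-learn's `FastICA`; it is a toy example, not the authors' pipeline, and the two "sources" and the 2 x 2 mixing matrix are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0.0, 1.0, n)

# Two hypothetical independent sources: a target muscle's burst-like
# activity and a second muscle whose activity leaks into both channels.
s1 = np.sign(np.sin(2 * np.pi * 7 * t))   # stand-in for target-muscle activity
s2 = rng.laplace(size=n)                  # stand-in for the crosstalk source
S_true = np.c_[s1, s2]

# Instantaneous mixing: each electrode picks up some of the other source,
# mimicking volume-conducted crosstalk between nearby facial muscles.
A = np.array([[1.0, 0.4],
              [0.3, 1.0]])
X = S_true @ A.T                          # observed 2-channel "recording"

# Decompose into independent components, then reconstruct after zeroing
# the component attributed to crosstalk. The component index here is
# arbitrary: in practice, components must be identified by inspecting
# their waveforms and mixing weights, since ICA ordering is not fixed.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)              # shape (n, 2): estimated components
S_clean = S_est.copy()
S_clean[:, 1] = 0.0
X_clean = ica.inverse_transform(S_clean)  # channel-space signal, crosstalk removed
```

Reconstructing from all components returns (numerically) the original recording, which is a useful sanity check that the decomposition itself loses nothing; the crosstalk reduction comes entirely from which components are retained.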
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sato, W.; Kochiyama, T. Crosstalk in Facial EMG and Its Reduction Using ICA. Sensors 2023, 23, 2720. https://doi.org/10.3390/s23052720