An Emotion Recognition Method for Humanoid Robot Body Movements Based on a PSO-BP-RMSProp Neural Network
Abstract
1. Introduction
- Construct an effective dataset of the robot’s emotional body movements and their corresponding emotional states, and explore trends in users’ perception of the different emotions.
- Propose a computational model that maps the robot’s emotional body movements to the corresponding emotional states, enabling emotion recognition from the robot’s body movements.
- Verify the effectiveness of the proposed model through emotion recognition experiments on the robot’s body movements, and identify a feasible approach to optimizing the recognition rate.
2. Related Works
2.1. Design of Robot Emotional Body Movements and Corresponding Perceptions of Users in HRI
2.2. The Neural Network Used for Emotion Recognition in HRI
3. Design of Robot’s Emotional Body Movements
3.1. Materials
3.2. Subjective Evaluation for User’s Perception of Emotion
3.3. Results and Analysis
4. Emotion Recognition Model Using a BP Neural Network for the Robot’s Body Movements
4.1. The Neural Network Topological Architecture of the Proposed Model
4.2. Encoding Rules
4.3. The Proposed Emotional Body Movement Recognition Model Using a BP Neural Network
5. Optimization of the Proposed Model Using PSO and RMSProp
6. Experiments and Comparisons
7. Conclusions
- Positive correlations were found between the emotion recognition rates of the robot’s body movements and the intensities of their emotional expressions. The strength of this effect differed across emotions: happiness recognition rates were strongly affected by the intensity of the happiness expressions, the influence of the robot’s negative-emotion expressions on the corresponding recognition rates was weaker, and surprise expressions showed the smallest relative effect on surprise recognition rates.
- A dataset consisting of 25 robot emotional body movements and corresponding emotional states was designed, and it can be used as research material in future work focused on HRI.
- The EBMR-BP model, which provides a feasible and effective approach to recognizing the emotions associated with a humanoid robot’s body movements, was proposed. The topological architecture, the encoding rule, and other aspects were described in detail. It allows for a mapping between a robot’s emotional body movements and human emotions to be successfully built using a computational model.
- Optimization of the EBMR-BP model by combining the PSO method with the RMSProp algorithm was proposed, illustrating a feasible optimization path and verifying the improvement in recognizing the emotions of the robot’s body movements. This approach achieved a relatively high emotion recognition rate for body movements (88.89%) with a relatively small amount of data; a minimal illustrative sketch follows below.
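As a concrete illustration of the last point, the following minimal sketch (not the authors’ implementation) shows the PSO-BP-RMSProp idea end to end: a particle swarm searches for good initial weights of a small BP classifier (20 inputs for the encoded motion state, 6 softmax outputs for the emotion classes), and RMSProp gradient descent then refines them. The hidden-layer size, swarm settings, learning rate, and the random toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HID, N_OUT = 20, 16, 6   # 20-bit motion code -> 6 emotion classes

def unpack(theta):
    """Split a flat parameter vector into the network's weights and biases."""
    i = 0
    W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = theta[i:i + N_HID]; i += N_HID
    W2 = theta[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = theta[i:i + N_OUT]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    H = np.tanh(X @ W1 + b1)                       # hidden layer
    Z = H @ W2 + b2
    E = np.exp(Z - Z.max(axis=1, keepdims=True))   # numerically stable softmax
    return H, E / E.sum(axis=1, keepdims=True)

def loss(theta, X, Y):
    _, P = forward(theta, X)
    return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))  # cross-entropy

def grad(theta, X, Y):
    """Backpropagation through the one-hidden-layer network above."""
    W1, b1, W2, b2 = unpack(theta)
    H, P = forward(theta, X)
    dZ = (P - Y) / len(X)
    dW2, db2 = H.T @ dZ, dZ.sum(axis=0)
    dH = (dZ @ W2.T) * (1.0 - H ** 2)              # tanh derivative
    dW1, db1 = X.T @ dH, dH.sum(axis=0)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

D = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT   # total parameter count

def pso_init(X, Y, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Standard PSO over candidate initial weights; fitness = training loss."""
    pos = rng.uniform(-1.0, 1.0, (n_particles, D))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([loss(p, X, Y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([loss(p, X, Y) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

def rmsprop_train(theta, X, Y, lr=0.01, rho=0.9, eps=1e-8, epochs=500):
    """RMSProp: scale each step by a running RMS of recent gradients."""
    theta, s = theta.copy(), np.zeros_like(theta)
    for _ in range(epochs):
        g = grad(theta, X, Y)
        s = rho * s + (1.0 - rho) * g ** 2
        theta -= lr * g / (np.sqrt(s) + eps)
    return theta

# Toy usage with random stand-ins for the encoded dataset (the real inputs
# are the 20-bit motion codes, the outputs 6-bit one-hot emotion labels):
X = rng.integers(0, 2, size=(25, N_IN)).astype(float)
Y = np.eye(N_OUT)[rng.integers(0, N_OUT, size=25)]
theta = rmsprop_train(pso_init(X, Y), X, Y)
print("final training loss:", loss(theta, X, Y))
```

PSO supplies a population-based global search over the non-convex weight space before gradient descent takes over, which is the division of labor the combined method relies on.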
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Correlation Coefficient | Happiness | Fear | Anger | Disgust | Sadness | Surprise |
---|---|---|---|---|---|---|
s | 0.97 | 0.79 | 0.876 | 0.674 | 0.811 | 0.569 |
| Emotion | Video | Emotion Recognition Rate (%) | M Value | SD |
|---|---|---|---|---|
| Happiness | V4 | 86.27 | 5.16 | 1.63 |
| | V39 | 86.27 | 5.84 | 1.60 |
| | V8 | 74.51 | 5.24 | 1.89 |
| | V28 | 72.55 | 5.10 | 1.73 |
| | V21 | 70.59 | 5.12 | 1.76 |
| | V20 | 68.63 | 5.00 | 2.38 |
| | V3 | 64.71 | 4.12 | 1.70 |
| | V22 | 62.75 | 4.43 | 1.93 |
| | V71 | 60.78 | 4.06 | 1.95 |
| | V44 | 60.78 | 4.06 | 1.99 |
| | V31 | 58.82 | 4.08 | 1.79 |
| | V55 | 58.82 | 4.39 | 2.30 |
| | V15 | 58.82 | 4.61 | 2.01 |
| | V40 | 54.90 | 4.51 | 2.12 |
| | V57 | 50.98 | 4.02 | 1.97 |
| | V67 | 50.98 | 3.98 | 2.20 |
| | V16 | 50.98 | 4.10 | 2.08 |
| Anger | V12 | 62.75 | 4.71 | 1.71 |
| Sadness | V65 | 66.67 | 5.63 | 1.77 |
| | V60 | 62.75 | 5.39 | 1.96 |
| | V59 | 54.90 | 4.61 | 2.07 |
| | V61 | 52.94 | 4.65 | 1.86 |
| Surprise | V33 | 58.82 | 4.55 | 1.74 |
| | V50 | 52.94 | 4.76 | 1.91 |
| | V49 | 52.94 | 4.63 | 1.73 |
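The correlation table further up summarizes, for each emotion, how strongly the per-video recognition rates track the perceived-intensity (M) ratings. As a minimal illustration, the sketch below computes such a coefficient for five of the happiness rows above; Pearson’s r is assumed here (the source does not state which coefficient its symbol s denotes), and using only a subset of videos means the result will not reproduce the table’s 0.97.

```python
# Illustrative only: correlate recognition rates with mean perceived
# intensity (M) for five of the happiness videos in the table above
# (V4, V8, V20, V15, V57).
from scipy.stats import pearsonr

rate = [86.27, 74.51, 68.63, 58.82, 50.98]  # emotion recognition rate (%)
m = [5.16, 5.24, 5.00, 4.61, 4.02]          # mean perceived intensity (M)
r, p = pearsonr(rate, m)
print(f"r = {r:.2f}, p = {p:.3f}")
```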
| Body Part (Motion State Set Xn) | Moving Direction (Xn) | Moving Mode (Xn) | Emotion State Set Yn |
|---|---|---|---|
| Head | Left | Rotation | Happiness |
| Torso | Right | Swing | Fear |
| Right foot | Forward | Vertical | Anger |
| Left foot | Backward | Extend | Disgust |
| Right hand | Upward | Close up | Sadness |
| Left hand | Downward | Bend | Surprise |
| | Inward | | |
| | Outward | | |
Video | The Encoded Input | The Encoded Output |
---|---|---|
V4 | {1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0} | {1, 0, 0, 0, 0, 0} |
V12 | {0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0} | {0, 0, 1, 0, 0, 0} |
V33 | {0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1} | {0, 0, 0, 0, 0, 1} |
V65 | {0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 1, 1} | {0, 0, 0, 0, 1, 0} |
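Together, the two tables above fix the model’s input and output spaces: each movement is described by a 20-dimensional binary vector (6 body-part bits, 8 moving-direction bits, and 6 moving-mode bits, matching the 6 + 8 + 6 rows of the motion state set), and each emotion by a 6-dimensional one-hot vector. Below is a minimal Python sketch of this encoding; the bit ordering follows the row order of the tables, an assumption under which the published V4 vectors decode consistently, and the code is an illustration rather than the authors’ implementation.

```python
# Encoding sketch: motion state set Xn -> 20-bit input, emotion -> 6-bit output.
BODY_PARTS = ["head", "torso", "right foot", "left foot", "right hand", "left hand"]
DIRECTIONS = ["left", "right", "forward", "backward",
              "upward", "downward", "inward", "outward"]
MODES = ["rotation", "swing", "vertical", "extend", "close up", "bend"]
EMOTIONS = ["happiness", "fear", "anger", "disgust", "sadness", "surprise"]

def encode_movement(parts, directions, modes):
    """Encode a described body movement as the 20-dim binary input vector."""
    return ([1 if p in parts else 0 for p in BODY_PARTS]
            + [1 if d in directions else 0 for d in DIRECTIONS]
            + [1 if m in modes else 0 for m in MODES])

def encode_emotion(emotion):
    """One-hot encode the emotion label as the 6-dim output vector."""
    return [1 if e == emotion else 0 for e in EMOTIONS]

# V4, decoded from its published vector under the assumed bit order:
# head/torso/left foot/both hands move left and forward, with rotation
# and extend, labeled happiness.
x = encode_movement({"head", "torso", "left foot", "right hand", "left hand"},
                    {"left", "forward"}, {"rotation", "extend"})
y = encode_emotion("happiness")
assert x == [1,1,0,1,1,1, 1,0,1,0,0,0,0,0, 1,0,0,1,0,0]
assert y == [1,0,0,0,0,0]
```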
| Video | Designed Emotion | EBMR-BP (Method I *) | PSO-BP (Method II *) | BP-RMSProp (Method III *) | PSO-BP-RMSProp | — |
|---|---|---|---|---|---|---|
| V15 | Happiness | Happiness | Happiness | Happiness | Happiness | Happiness |
| V16 | Happiness | Happiness | Happiness | Sadness | Happiness | Happiness |
| V40 | Happiness | Happiness | Happiness | Happiness | Happiness | Happiness |
| V49 | Surprise | Happiness | Happiness | Happiness | Surprise | Happiness |
| V57 | Happiness | Happiness | Happiness | Happiness | Happiness | Happiness |
| V67 | Happiness | Anger | Sadness | Anger | Anger | Happiness |
| B8 | Anger | Anger | Happiness | Disgust | Anger | Happiness |
| B9 | Sadness | Sadness | Sadness | Happiness | Sadness | Happiness |
| B14 | Fear | Happiness | Happiness | Fear | Fear | Happiness |
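Reading the PSO-BP-RMSProp column against the designed emotions (the grouped two-row header is flattened in the source, so the pairing of the Method I–III labels with the model names is inferred), only V67 among the nine test movements is misclassified, which matches the recognition rate reported in the conclusions:

$$\frac{8}{9} \times 100\% \approx 88.89\%$$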
Gao, W.; Jiang, T.; Zhai, W.; Zha, F. An Emotion Recognition Method for Humanoid Robot Body Movements Based on a PSO-BP-RMSProp Neural Network. Sensors 2024, 24, 7227. https://doi.org/10.3390/s24227227