Article

Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression

Department of Neurosciences, Imaging and Clinical Sciences, University G. d’Annunzio of Chieti-Pescara, 66100 Chieti, Italy
* Author to whom correspondence should be addressed.
Sensors 2021, 21(19), 6438; https://doi.org/10.3390/s21196438
Submission received: 4 August 2021 / Revised: 20 September 2021 / Accepted: 23 September 2021 / Published: 27 September 2021
(This article belongs to the Special Issue Multi-Agent-Based Human-Computer Interaction)

Abstract

An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As in person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot’s awareness of human facial expressions and provide the robot with an interlocutor’s arousal level detection capability. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
Keywords: facial expression recognition; emotion recognition; human–robot interaction; affective computing; machine learning
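
For readers interested in what such an integration looks like in practice, the following is a minimal sketch of a pipeline of the kind the abstract describes: a frame is grabbed from NAO's camera through the NAOqi Python SDK (ALVideoDevice) and classified by a pre-trained CNN. The model file name (fer_model.h5), the 48x48 grayscale input size, and the label order are illustrative assumptions, not the authors' actual implementation.

    import cv2
    import numpy as np
    from naoqi import ALProxy                      # NAOqi Python SDK
    from tensorflow.keras.models import load_model

    NAO_IP, NAO_PORT = "192.168.1.10", 9559        # assumed robot address
    # Assumed class order; a trained model's actual label order may differ.
    LABELS = ["angry", "scared", "happy", "neutral", "sad", "surprised"]

    model = load_model("fer_model.h5")             # hypothetical trained CNN

    video = ALProxy("ALVideoDevice", NAO_IP, NAO_PORT)
    # Subscribe to the top camera (0) at VGA resolution (2), RGB color space (11), 10 fps.
    handle = video.subscribeCamera("fer_client", 0, 2, 11, 10)
    try:
        frame = video.getImageRemote(handle)       # ALValue: [width, height, ..., raw bytes, ...]
        width, height, raw = frame[0], frame[1], frame[6]
        img = np.frombuffer(raw, dtype=np.uint8).reshape((height, width, 3))
        # Assumed preprocessing: 48x48 grayscale input, common for FER CNNs.
        gray = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)
        face = cv2.resize(gray, (48, 48)).astype("float32") / 255.0
        probs = model.predict(face[np.newaxis, :, :, np.newaxis])[0]
        print("Predicted expression:", LABELS[int(np.argmax(probs))])
    finally:
        video.unsubscribe(handle)

Note that NAOqi's Python bindings historically target Python 2, while current TensorFlow requires Python 3; in practice, frame capture and model inference therefore often run in separate processes bridged over a socket, a detail the single-process sketch above glosses over.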

