A Systematic Review of Adaptivity in Human-Robot Interaction
Abstract
1. Introduction
2. Methodology
2.1. Inclusion and Exclusion Criteria
2.2. Coding Scheme
3. State of the Art Social Adaptive Robots
3.1. Health Care and Therapy Domain
Discussion
3.2. Education
Discussion
3.3. Work Environments and Public Spaces
Discussion
3.4. At Home
Discussion
4. Future Directions and Challenges
4.1. User and Adaptations
4.2. User Emotions and Adaptation
4.3. Robot’s Memory and Adaptation
4.4. Personality and Adaptation
4.5. Robot’s Voice Adaptation
4.6. Culture and Adaptation
5. Open Issues
5.1. Understanding the Context of Adaptation
5.2. Evaluation Metrics
5.3. Ethical Concerns
6. Conclusions
Author Contributions
Conflicts of Interest
References
- Wu, W.Y. Adaptive User Interface. U.S. Patent 2010/0097331, 22 April 2010. [Google Scholar]
- Liu, J.; Wong, C.K.; Hui, K.K. An adaptive user interface based on personalized learning. IEEE Intell. Syst. 2003, 18, 52–57. [Google Scholar]
- Walsh, D.; Lin, L.C.; Dils, P.B. Context-Adaptive User Interface for a Portion of a Display Screen. U.S. Patent D678,898, 26 March 2013. [Google Scholar]
- Macbeth, S.W.; Fernandez, R.L.; Meyers, B.R.; Tan, D.S.; Robertson, G.G.; Oliver, N.M.; Murillo, O.E.; Pedersen, E.R.; Czerwinski, M.P.; Pinckney, M.D.; et al. Activity-Centric Adaptive User Interface. U.S. Patent App. 11/426,804, 27 June 2006. [Google Scholar]
- Leite, I.; Martinho, C.; Paiva, A. Social robots for long-term interaction: A survey. Int. J. Soc. Robot. 2013, 5, 291–308. [Google Scholar] [CrossRef]
- Tapus, A.; Mataric, M.J.; Scassellati, B. Socially assistive robotics [grand challenges of robotics]. IEEE Robot. Autom. Mag. 2007, 14, 35–42. [Google Scholar] [CrossRef]
- Endsley, M.R. Level of automation effects on performance, situation awareness and workload in a dynamic control task. Ergonomics 1999, 42, 462–492. [Google Scholar] [CrossRef] [PubMed]
- Beer, J.; Fisk, A.D.; Rogers, W.A. Toward a framework for levels of robot autonomy in human-robot interaction. J. Hum.-Robot Interact. 2014, 3, 74. [Google Scholar] [CrossRef]
- Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166. [Google Scholar] [CrossRef]
- Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.J. A review of the applicability of robots in education. J. Technol. Educ. Learn. 2013, 1, 209–0015. [Google Scholar] [CrossRef]
- Robinson, H.; MacDonald, B.; Broadbent, E. The role of healthcare robots for older people at home: A review. Int. J. Soc. Robot. 2014, 6, 575–591. [Google Scholar] [CrossRef]
- Saerbeck, M.; Schut, T.; Bartneck, C.; Janse, M.D. Expressive robots in education: Varying the degree of social supportive behavior of a robotic tutor. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM 2010), Atlanta, GA, USA, 10–15 April 2010; pp. 1613–1622. [Google Scholar]
- Bruce, A.; Nourbakhsh, I.; Simmons, R. The role of expressiveness and attention in human-robot interaction. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation (ICRA), Washington, DC, USA, 11–15 May 2002; Volume 4, pp. 4138–4142. [Google Scholar]
- Bartneck, C.; Kulić, D.; Croft, E.; Zoghbi, S. Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int. J. Soc. Robot. 2009, 1, 71–81. [Google Scholar] [CrossRef]
- Fortunati, L.; Esposito, A.; Lugano, G. Introduction to the special issue “Beyond industrial robotics: Social robots entering public and domestic spheres”. Inf. Soc. 2015, 31, 229–236. [Google Scholar] [CrossRef]
- Komatsubara, T.; Shiomi, M.; Kanda, T.; Ishiguro, H.; Hagita, N. Can a social robot help children’s understanding of science in classrooms? In Proceedings of the Second International Conference on Human-Agent Interaction (ACM 2014), Tsukuba, Japan, 28–31 October 2014; pp. 83–90. [Google Scholar]
- Jimenez, F.; Yoshikawa, T.; Furuhashi, T.; Kanoh, M. An emotional expression model for educational-support robots. J. Artif. Intell. Soft Comput. Res. 2015, 5, 51–57. [Google Scholar] [CrossRef]
- Benitti, F.B.V. Exploring the educational potential of robotics in schools: A systematic review. Comput. Educ. 2012, 58, 978–988. [Google Scholar] [CrossRef]
- Meho, L.I.; Yang, K. Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. J. Assoc. Inf. Sci. Technol. 2007, 58, 2105–2125. [Google Scholar] [CrossRef]
- Google Scholar. Top Publications: Robotics. Available online: https://scholar.google.es/citations (accessed on 21 April 2017).
- François, D.; Polani, D.; Dautenhahn, K. Towards socially adaptive robots: A novel method for real time recognition of human-robot interaction styles. In Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots, Daejeon, Korea, 1–3 December 2008; pp. 353–359. [Google Scholar]
- Tapus, A.; Ţăpuş, C.; Matarić, M.J. User—Robot personality matching and assistive robot behavior adaptation for post-stroke rehabilitation therapy. Intell. Serv. Robot. 2008, 1, 169. [Google Scholar] [CrossRef]
- Robins, B.; Dautenhahn, K.; Te Boekhorst, R.; Nehaniv, C.L. Behaviour delay and robot expressiveness in child-robot interactions: A user study on interaction kinesics. In Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction (ACM 2008), Amsterdam, The Netherlands, 12–15 March 2008; pp. 17–24. [Google Scholar]
- Tapus, A. Improving the quality of life of people with dementia through the use of socially assistive robots. In Proceedings of the Advanced Technologies for Enhanced Quality of Life, Iasi, Romania, 22–26 July 2009; pp. 81–86. [Google Scholar]
- Boccanfuso, L.; O’Kane, J.M. CHARLIE: An adaptive robot design with hand and face tracking for use in autism therapy. Int. J. Soc. Robot. 2011, 3, 337–347. [Google Scholar] [CrossRef]
- McColl, D.; Nejat, G. Meal-time with a socially assistive robot and older adults at a long-term care facility. J. Hum.-Robot Interact. 2013, 2, 152–171. [Google Scholar] [CrossRef]
- Wainer, J.; Dautenhahn, K.; Robins, B.; Amirabdollahian, F. A pilot study with a novel setup for collaborative play of the humanoid robot KASPAR with children with autism. Int. J. Soc. Robot. 2014, 6, 45–65. [Google Scholar] [CrossRef] [Green Version]
- Stafford, R.Q.; MacDonald, B.A.; Jayawardena, C.; Wegner, D.M.; Broadbent, E. Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. Int. J. Soc. Robot. 2014, 6, 17–32. [Google Scholar] [CrossRef]
- Coninx, A.; Baxter, P.; Oleari, E.; Bellini, S.; Bierman, B.; Henkemans, O.B.; Cañamero, L.; Cosi, P.; Enescu, V.; Espinoza, R.R.; et al. Towards long-term social child-robot interaction: Using multi-activity switching to engage young users. J. Hum.-Robot Interact. 2016, 5, 32–67. [Google Scholar] [CrossRef] [Green Version]
- Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of healthcare robots for the older population: Review and future directions. Int. J. Soc. Robot. 2009, 1, 319–330. [Google Scholar] [CrossRef]
- Williams, A.B. The qualitative impact of using LEGO MINDSTORMS robots to teach computer engineering. IEEE Trans. Educ. 2003, 46, 206. [Google Scholar] [CrossRef]
- Serholt, S.; Barendregt, W.; Leite, I.; Hastie, H.; Jones, A.; Paiva, A.; Vasalou, A.; Castellano, G. Teachers’ views on the use of empathic robotic tutors in the classroom. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 955–960. [Google Scholar]
- Ahmad, M.I.; Mubin, O.; Orlando, J. Understanding behaviours and roles for social and adaptive robots in education: teacher’s perspective. In Proceedings of the Fourth International Conference on Human Agent Interaction, Singapore, 4–7 October 2016; pp. 297–304. [Google Scholar]
- Salter, T.; Michaud, F.; Létourneau, D.; Lee, D.; Werry, I.P. Using proprioceptive sensors for categorizing human-robot interactions. In Proceedings of the 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI), Washington, DC, USA, 9–11 March 2007; pp. 105–112. [Google Scholar]
- Gonzalez-Pacheco, V.; Ramey, A.; Alonso-Martín, F.; Castro-Gonzalez, A.; Salichs, M.A. Maggie: A social robot as a gaming platform. Int. J. Soc. Robot. 2011, 3, 371–381. [Google Scholar] [CrossRef] [Green Version]
- Janssen, J.B.; van der Wal, C.C.; Neerincx, M.A.; Looije, R. Motivating children to learn arithmetic with an adaptive robot game. In Proceedings of the International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011; pp. 153–162. [Google Scholar]
- Szafir, D.; Mutlu, B. Pay attention!: Designing adaptive agents that monitor and improve user engagement. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM 2012), Austin, TX, USA, 5–10 May 2012; pp. 11–20. [Google Scholar]
- Kühnlenz, B.; Sosnowski, S.; Buß, M.; Wollherr, D.; Kühnlenz, K.; Buss, M. Increasing helpfulness towards a robot by emotional adaption to the user. Int. J. Soc. Robot. 2013, 5, 457–476. [Google Scholar] [CrossRef]
- Brown, L.; Kerwin, R.; Howard, A.M. Applying behavioral strategies for student engagement using a robotic educational agent. In Proceedings of the 2013 IEEE international conference on Systems, Man, and Cybernetics (SMC 2013), Manchester, UK, 13–16 October 2013; pp. 4360–4365. [Google Scholar]
- Ros, R.; Baroni, I.; Demiris, Y. Adaptive human-robot interaction in sensorimotor task instruction: From human to robot dance tutors. Robot. Auton. Syst. 2014, 62, 707–720. [Google Scholar] [CrossRef]
- Leite, I.; Castellano, G.; Pereira, A.; Martinho, C.; Paiva, A. Empathic robots for long-term interaction. Int. J. Soc. Robot. 2014, 6, 329–341. [Google Scholar] [CrossRef]
- Uluer, P.; Akalın, N.; Köse, H. A new robotic platform for sign language tutoring. Int. J. Soc. Robot. 2015, 7, 571–585. [Google Scholar] [CrossRef]
- De Greeff, J.; Belpaeme, T. Why robots should be social: Enhancing machine learning through social human-robot interaction. PLoS ONE 2015, 10, e0138061. [Google Scholar] [CrossRef] [PubMed]
- Obaid, M.; Barendregt, W.; Alves-Oliveira, P.; Paiva, A.; Fjeld, M. Designing Robotic Teaching Assistants: Interaction Design Students’ and Children’s Views. In Proceedings of the International Conference on Social Robotics, Paris, France, 26–30 October 2015; pp. 502–511. [Google Scholar]
- Kanda, T.; Shiomi, M.; Miyashita, Z.; Ishiguro, H.; Hagita, N. An affective guide robot in a shopping mall. In Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), San Diego, CA, USA, 9–13 March 2009; pp. 173–180. [Google Scholar]
- Asif, M.; Sabeel, M.; Mujeeb-ur Rahman, K.Z. Waiter robot-solution to restaurant automation. In Proceedings of the 1st Student Multi Disciplinary Research Conference (MDSRC), Wah, Pakistan, 14–15 November 2015. [Google Scholar]
- Hoffman, G.; Breazeal, C. Effects of anticipatory action on human-robot teamwork efficiency, fluency, and perception of team. In Proceedings of the ACM/IEEE international conference on Human-robot interaction, Washington, DC, USA, 9–11 March 2007; pp. 1–8. [Google Scholar]
- Svenstrup, M.; Bak, T.; Maler, O.; Andersen, H.J.; Jensen, O.B. Pilot study of person robot interaction in a public transit space. In Proceedings of the International Conference on Research and Education in Robotics, Heidelberg, Germany, 22–24 May 2008; pp. 96–106. [Google Scholar]
- Lee, M.K.; Forlizzi, J. Designing adaptive robotic services. In Proceedings of the International Association of Societies of Design Research 2009, Seoul, Korea, 18–22 October 2009. [Google Scholar]
- Lee, M.K.; Kiesler, S.; Forlizzi, J.; Rybski, P. Ripple effects of an embedded social agent: A field study of a social robot in the workplace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM 2012), Austin, TX, USA, 5–10 May 2012; pp. 695–704. [Google Scholar]
- Kanda, T.; Shiomi, M.; Miyashita, Z.; Ishiguro, H.; Hagita, N. A communication robot in a shopping mall. IEEE Trans. Robot. 2010, 26, 897–913. [Google Scholar] [CrossRef]
- Shiomi, M.; Shinozawa, K.; Nakagawa, Y.; Miyashita, T.; Sakamoto, T.; Terakubo, T.; Ishiguro, H.; Hagita, N. Recommendation effects of a social robot for advertisement-use context in a shopping mall. Int. J. Soc. Robot. 2013, 5, 251–262. [Google Scholar] [CrossRef]
- Sekmen, A.; Challa, P. Assessment of adaptive human-robot interactions. Knowl.-Based Syst. 2013, 42, 49–59. [Google Scholar] [CrossRef]
- Rousseau, V.; Ferland, F.; Létourneau, D.; Michaud, F. Sorry to interrupt, but may I have your attention? Preliminary design and evaluation of autonomous engagement in HRI. J. Hum.-Robot Interact. 2013, 2, 41–61. [Google Scholar] [CrossRef]
- Aly, A.; Tapus, A. A model for synthesizing a combined verbal and nonverbal behavior based on personality traits in human-robot interaction. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction, Tokyo, Japan, 3–6 March 2013; IEEE Press: Piscataway, NJ, USA, 2013; pp. 325–332. [Google Scholar]
- Keizer, S.; Ellen Foster, M.; Wang, Z.; Lemon, O. Machine Learning for Social Multiparty Human-Robot Interaction. ACM Trans. Interact. Intell. Syst. 2014, 4, 14. [Google Scholar] [CrossRef]
- Shiomi, M.; Iio, T.; Kamei, K.; Sharma, C.; Hagita, N. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan. PLoS ONE 2015, 10, e0128031. [Google Scholar] [CrossRef] [PubMed]
- Kato, Y.; Kanda, T.; Ishiguro, H. May I help you?: Design of human-like polite approaching behavior. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (ACM 2015), Portland, OR, USA, 2–5 March 2015; pp. 35–42. [Google Scholar]
- Dang, T.H.H.; Tapus, A. Stress Game: The Role of Motivational Robotic Assistance in Reducing User’s Task Stress. Int. J. Soc. Robot. 2015, 7, 227–240. [Google Scholar] [CrossRef]
- Liu, P.; Glas, D.F.; Kanda, T.; Ishiguro, H.; Hagita, N. A Model for Generating Socially-Appropriate Deictic Behaviors Towards People. Int. J. Soc. Robot. 2017, 9, 33–49. [Google Scholar] [CrossRef]
- Sisbot, E.A.; Alami, R.; Siméon, T.; Dautenhahn, K.; Walters, M.; Woods, S. Navigation in the presence of humans. In Proceedings of the 2005 5th IEEE-RAS International Conference on Humanoid Robots, Tsukuba, Japan, 5–7 December 2005; pp. 181–188. [Google Scholar] [Green Version]
- Forlizzi, J.; DiSalvo, C. Service robots in the domestic environment: A study of the roomba vacuum in the home. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction (ACM 2006), Salt Lake City, UT, USA, 2–3 March 2006; pp. 258–265. [Google Scholar]
- Fink, J.; Mubin, O.; Kaplan, F.; Dillenbourg, P. Roomba is not a Robot; AIBO is still Alive! Anthropomorphic Language in Online Forums. In Proceedings of the 3rd International Conference on Social Robotics (ICSR 2011), Amsterdam, The Netherlands, 23–25 March 2011. [Google Scholar]
- Torrey, C.; Powers, A.; Marge, M.; Fussell, S.R.; Kiesler, S. Effects of adaptive robot dialogue on information exchange and social relations. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction (ACM 2006), Salt Lake City, UT, USA, 2–3 March 2006; pp. 126–133. [Google Scholar]
- Torrey, C.; Powers, A.; Fussell, S.R.; Kiesler, S. Exploring adaptive dialogue based on a robot’s awareness of human gaze and task progress. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (ACM 2007), Washington, DC, USA, 9–11 March 2007; pp. 247–254. [Google Scholar]
- Gross, H.M.; Schroeter, C.; Mueller, S.; Volkhardt, M.; Einhorn, E.; Bley, A.; Martin, C.; Langner, T.; Merten, M. Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, CA, USA, 25–30 September 2011; pp. 2430–2437. [Google Scholar]
- Gross, H.M.; Mueller, S.; Schroeter, C.; Volkhardt, M.; Scheidig, A.; Debes, K.; Richter, K.; Doering, N. Robot companion for domestic health assistance: Implementation, test and case study under everyday conditions in private apartments. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5992–5999. [Google Scholar]
- Cooney, M.; Kanda, T.; Alissandrakis, A.; Ishiguro, H. Designing enjoyable motion-based play interactions with a small humanoid robot. Int. J. Soc. Robot. 2014, 6, 173–193. [Google Scholar] [CrossRef]
- Youssef, K.; Asano, T.; De Silva, P.R.S.; Okada, M. Sociable Dining Table: Meaning Acquisition Exploration in Knock-Based Proto-Communication. Int. J. Soc. Robot. 2016, 8, 67–84. [Google Scholar] [CrossRef]
- Cameron, D.; Fernando, S.; Collins, E.; Millings, A.; Moore, R.; Sharkey, A.; Evers, V.; Prescott, T. Presence of life-like robot expressions influences children’s enjoyment of human-robot interactions in the field. In Proceedings of the AISB Convention 2015: The Society for the Study of Artificial Intelligence and Simulation of Behaviour, Canterbury, UK, 20–22 April 2015. [Google Scholar]
- Andersen, P.A.; Guerrero, L.K. Principles of communication and emotion in social interaction. In Handbook of Communication and Emotion: Research, Theory, Applications, and Contexts; Academic Press: Cambridge, MA, USA, 1998; pp. 49–96. [Google Scholar]
- Belpaeme, T.; Baxter, P.E.; Read, R.; Wood, R.; Cuayáhuitl, H.; Kiefer, B.; Racioppa, S.; Kruijff-Korbayová, I.; Athanasopoulos, G.; Enescu, V.; et al. Multimodal child-robot interaction: Building social bonds. J. Hum.-Robot. Interact. 2012, 1, 33–53. [Google Scholar] [CrossRef] [Green Version]
- Ahmad, M.I.; Mubin, O.; Orlando, J. Adaptive Social Robot for Sustaining Social Engagement during Long-Term Children-Robot Interaction. Int. J. Hum.-Comput. Interact. 2017. [Google Scholar] [CrossRef]
- Cuadrado, L.E.I.; Riesco, Á.M.; López, F.D.L.P. ARTIE: An Integrated Environment for the Development of Affective Robot Tutors. Front. Comput. Neurosci. 2016, 10, 77. [Google Scholar]
- Baxter, P. Memory-Centred Cognitive Architectures for Robots Interacting Socially with Humans. arXiv 2016, arXiv:1602.05638. [Google Scholar]
- Smith, S.M.; Petty, R.E. Personality moderators of mood congruency effects on cognition: The role of self-esteem and negative mood regulation. J. Pers. Soc. Psychol. 1995, 68, 1092. [Google Scholar] [CrossRef] [PubMed]
- Lubold, N.; Walker, E.; Pon-Barry, H. Effects of voice-adaptation and social dialogue on perceptions of a robotic learning companion. In Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction, Christchurch, New Zealand, 7–10 March 2016. [Google Scholar]
- Shahid, S.; Krahmer, E.; Swerts, M. Alone or together: Exploring the effect of physical co-presence on the emotional expressions of game playing children across cultures. Fun Games 2008, 5294, 94–105. [Google Scholar]
- Rau, P.P.; Li, Y.; Li, D. Effects of communication style and culture on ability to accept recommendations from robots. Comput. Hum. Behav. 2009, 25, 587–595. [Google Scholar] [CrossRef]
- Ahmad, M.I.; Mubin, O.; Orlando, J. Effect of Different Adaptations by a Robot on Children’s Long-term Engagement: An Exploratory Study. In Proceedings of the 13th International Conference on Advances in Computer Entertainment Technology (ACM 2016), Osaka, Japan, 9–12 November 2016; p. 31. [Google Scholar]
References | Robot | Study Design | Robot Capabilities | Adaptive Features |
---|---|---|---|---|
Francois et al., 2008 | AIBO anthropomorphic: Yes | Subjects: 5 autistic children No. of Interactions: 1 Interaction Type: autonomous Measures: interaction style recognition Method: video analysis | provides feedback after detecting the interaction style | user-interaction-style-based dialogue adaptation
Tapus et al., 2008 | ActiveMedia Pioneer 2-DX mobile robot anthropomorphic: No | Subjects: 19 adults No. of Interactions: one-off Interaction Type: autonomous Conditions: (robot vs. human) Measures: task performance Method: Video analysis | updates dialogue and tracks the user's movement | user's task-performance-based adaptations
Robins et al., 2008 | KASPAR anthropomorphic: Yes | Subjects: 18 children No. of Interactions: one-off Interaction Type: WoZ-controlled robot Measures: (1) duration of the turn-taking pause between the child and the robot during both games; (2) duration of the child's imitating reactions Method: video analysis | plays the game and displays gestures and facial expressions | game-based adaptations
Tapus et al., 2009 | Torso anthropomorphic: Yes | Subjects: 9 adults No. of Interactions: once every week for six months Interaction Type: autonomous Conditions: (robot vs. human) Measures: task performance Method: Video analysis | displays gestures and facial expressions, and utters speech | user's personality-based adaptations
Boccanfuso et al., 2011 | CHARLIE anthropomorphic: Yes | Subjects: 3 children with type 1 diabetes No. of Interactions: one-off Interaction Type: autonomous Measures: speed and accuracy of the robot's hand and face detection abilities Method: Video analysis | plays an imitation game and performs face tracking | detecting faces and imitating hand movements
McColl et al., 2013 | Brian anthropomorphic: Yes | Subjects: 8 elderly people No. of Interactions: one-off Interaction Type: autonomous Measures: user engagement Method: Questionnaires | updates dialogue and gestures based on the user's eating behaviour | dialogue and gesture-based adaptations
Wainer et al., 2013 | KASPAR anthropomorphic: Yes | Subjects: 6 autistic children No. of Interactions: one-off Interaction Type: autonomous Conditions: (robot vs. human) Measures: enjoyment, collaboration Method: Video analysis | displays gestures and facial expressions, and utters speech | game-event-based adaptations
Stafford et al., 2013 | CHARLIE anthropomorphic: Yes | Subjects: 25 elderly No. of Interactions: 3 interactions per participant between one and two months Interaction Type: autonomous Measures: user experience Method: Questionnaires | speech generation, face recognition, understanding touch sensors, navigation to the user's room | user profiling (scheduling visits, reminding medications, blood pressure measurements)
Coninx et al., 2016 | NAO anthropomorphic: Yes | Subjects: 3 children with type 1 diabetes No. of Interactions: 3 interactions per child between one and two months Interaction Type: autonomous with WoZ-controlled speech Measures: user experience Method: Questionnaires and logged data | switches between activities, displays gestures, dances, and utters speech | user profiling (name, age, performance, preferences), user emotion detection, memory adaptations
References | Robot | Study Design | Robot Capabilities | Adaptive Features |
---|---|---|---|---|
Salter et al., 2007 | Roball anthropomorphic: No | Subjects: 12 children No. of Interactions: one-off Interaction Type: autonomous Measures: accelerometers, tilt sensors Method: video analysis | moves and avoids obstacles | user-playing-pattern-based adaptation
Janssen et al., 2011 | NAO anthropomorphic: Yes | Subjects: 20 children Conditions: between-subject (personalized vs. group-level versions) No. of Interactions: 3 Interaction Type: semi-autonomous Measures: motivation Method: Questionnaires | generates context-aware dialogue during the game | game-event-based adaptation
Szafir et al., 2012 | Virtual Agent | Subjects: 30 children, 10 per group Conditions: between-subject (low immediacy vs. random immediacy vs. adaptive behaviours) No. of Interactions: one-off Interaction Type: autonomous Measures: user attention and task performance Method: Questionnaires | updates dialogue, controls voice and displays gestures | user-interaction-engagement-based adaptation
Brown et al., 2013 | DARWIN anthropomorphic: Yes | Subjects: 24 children Conditions: between-subject (verbally interactive robot vs. non-verbally interactive robot vs. verbally and non-verbally interactive robot vs. no robot) No. of Interactions: one-off Interaction Type: semi-autonomous Measures: user engagement Method: Questionnaires | displays supportive dialogue and gestures during the game | game-event-based adaptation
Kuehnlenz et al., 2013 | EDDIE anthropomorphic: Yes | Subjects: 84 adults Conditions: full emotion adaption vs. implicit adaption vs. explicit adaption vs. no adaption No. of Interactions: one-off Interaction Type: autonomous Measures: robot's helpfulness Method: Questionnaires | displays emotional verbal and facial expressions | emotion-based adaptations
Ros et al., 2014 | NAO anthropomorphic: Yes | Subjects: 12 children No. of Interactions: 18 Interaction Type: autonomous Measures: social engagement Method: video analysis, Questionnaires | updates both verbal (text-to-speech) and non-verbal (LEDs, head poses, and dance moves) behaviours | user-profiling and memory-based adaptation
Leite et al., 2014 | iCat anthropomorphic: Yes | Subjects: 16 children No. of Interactions: 5 Interaction Type: autonomous Measures: engagement, perceived support and social co-presence Method: Questionnaires | empathic dialogue, facial expressions | user profiling (name, performance), user-emotion-based adaptation
Uluer et al., 2015 | Robovie anthropomorphic: Yes | Subjects: 3 groups (18 graduate students, 6 children with typical hearing, 18 hearing-impaired children) No. of Interactions: one-off Interaction Type: semi-autonomous Measures: learning performance Method: Video analysis | plays games, displays gestures and LED lights, and utters speech | gesture-, LED- and speech/sound-specific adaptations
De Greeff et al., 2015 | LightHead anthropomorphic: Yes | Subjects: 41 adults Conditions: social vs. non-social No. of Interactions: one-off Interaction Type: semi-autonomous Measures: robot's learning performance and gaze behaviour Method: Questionnaires and video analysis | plays a turn-taking language game with a human teacher to learn words | gaze-based adaptation, user-performance-based facial and verbal expressions
References | Robot | Study Design | Robot Capabilities | Adaptive Features |
---|---|---|---|---|
Hoffman et al., 2007 | Virtual Agent | Subjects: 32 adults No. of Interactions: one-off Interaction Type: autonomous with WoZ-controlled speech Measures: time taken to complete a task Method: logged data | attaches car parts by anticipating the user's actions | user- and game-based adaptations
Svenstrup et al., 2008 | FESTO anthropomorphic: Yes | Subjects: 48 adults No. of Interactions: one-off Interaction Type: autonomous Measures: user experience Method: Questionnaires and interviews | detects individuals, plays music and shows expressions | user-identification-based adaptation
Lee et al., 2009 | SnackBot anthropomorphic: Yes | Subjects: 21 adults No. of Interactions: four-month field study Interaction Type: autonomous with WoZ-controlled speech recognition Measures: user experience Method: Questionnaires and interviews | utters context-aware speech and delivers food while navigating inside the hall | user-preference-based adaptation
Kanda et al., 2010 | Robovie-IIF anthropomorphic: Yes | Subjects: 235 adults No. of Interactions: 25-day field trial Interaction Type: autonomous with WoZ-controlled speech recognition Measures: enjoyment and social interaction, visitor's perception Method: Questionnaires and video analysis | identifies users, provides shopping information and route guidance, and inquires about personal information | user- and memory-based adaptation
Shiomi et al., 2013 | Robovie-II Robovie-miniR2 anthropomorphic: Yes | Subjects: 256 (only interacting users) Conditions: GUI, Robovie-miniR2, and Robovie-II No. of Interactions: field trial Interaction Type: autonomous with WoZ-controlled speech recognition Measures: robot's learning performance and gaze behaviour Method: Questionnaires and video analysis | utters speech and displays gestures and non-verbal behaviours according to the person's action | user-specific gesture and dialogue-based adaptations
Sekmen et al., 2013 | Pioneer 3-AT mobile robot anthropomorphic: No | Subjects: 25 adults Conditions: adaptive vs. non-adaptive No. of Interactions: one-off Interaction Type: semi-autonomous Measures: user preference Method: Questionnaires | detects and recognises faces and speech, understands natural language, filters information from the Internet, and navigates by following a map | speech and user-based adaptation
Rousseau et al., 2013 | Robovie anthropomorphic: Yes | Subjects: 381 visitors No. of Interactions: field trial Interaction Type: autonomous Measures: user preference on the robot's behaviours Method: video analysis | detects users, navigates toward them, and displays facial expressions, head movements, and arm gestures | user-identification-based adaptation
Aly et al., 2013 | NAO anthropomorphic: Yes | Subjects: 35 children No. of Interactions: one-off Interaction Type: WoZ-controlled robot Measures: engagement Method: Questionnaires | utters speech and displays gestures | personality-based adaptations
Keizer et al., 2014 | iCat anthropomorphic: Yes | Subjects: 37 adults Conditions: rule-based Social Skill Recognizer (SSR) vs. trained SSR No. of Interactions: one-off Interaction Type: autonomous Measures: customer detection, detection time, system response time, drink-serving time, number of engagement changes Method: Questionnaires and video analysis | detects and tracks multiple customers, serves drinks and takes orders | speech and user-based adaptation
Kato et al., 2015 | Robovie anthropomorphic: Yes | Subjects: 26 visitors Conditions: intention estimation algorithm vs. proactive vs. non-adaptive No. of Interactions: field trial Interaction Type: autonomous with WoZ-controlled speech recognition Measures: interaction intention success rate Method: video analysis | detects visitors and interacts with them | user-interaction-intention-based adaptation
Shiomi et al., 2015 | NEO-PR45 anthropomorphic: Yes | Subjects: 28 adults Conditions: simple robot vs. adaptive robot vs. human caregiver No. of Interactions: one-off Interaction Type: autonomous Measures: ease of use, enjoyment, degree of comfort Method: Questionnaires and interviews | provides speech-based feedback and adapts speed based on user preference | speed and speech-based adaptation
Dang et al., 2015 | NAO anthropomorphic: Yes | Subjects: 17 adults Conditions: robot vs. no robot under a normal alarm system and a stressful alarm system No. of Interactions: one-off Interaction Type: semi-autonomous Measures: interaction preference Method: video analysis | generates true or false alarms through speech and gestures | user-stress-level and personality-based adaptations
Liu et al., 2016 | Robovie anthropomorphic: Yes | Subjects: 33 adults No. of Interactions: one-off Interaction Type: WoZ-controlled robot Measures: naturalness, understandability, perceived politeness and overall goodness of the robot's deictic behaviours Method: Questionnaire | displays deictic behaviours through gaze and through casual and precise pointing | user-pointing-behaviour-based adaptations
References | Robot | Study Design | Robot Capabilities | Adaptive Features |
---|---|---|---|---|
Torrey et al., 2006 | Pearl anthropomorphic: Yes | Subjects: 49 adults No. of Interactions: one-off Interaction Type: autonomous Measures: information exchange and social relations Method: Questionnaire and video analysis | utters context-aware speech | dialogue-based adaptations
Torrey et al., 2007 | Pearl anthropomorphic: Yes | Subjects: 66 adults Conditions: “Question Only”, “Gaze Added”, “Delay Added”, “Immediate Added” No. of Interactions: one-off Interaction Type: autonomous Measures: Performance, communication and subjective evaluation Method: Video analysis, Questionnaires and Interviews | utters context-aware speech with gaze movement | Dialogue-based Adaptations |
Cooney et al., 2014 | SPONGE anthropomorphic: No | Subjects: 20 adults (within-subject) Conditions: (Naïve design vs. proposed design) and (reward, vs. suggestion) No. of Interactions: One-off Interaction Type: autonomous Measures: total interacting users, total printed coupons, and interaction initiation. Method: video analysis | provides rewards and suggestions based on understanding human gestures | Dialogue Adaptation Based on Human Gestures |
Gross et al., 2015 | CompanionAble ’Max’ anthropomorphic: Yes | Subjects: 9 elderly No. of Interactions: trial over three days Interaction Type: semi-autonomous Measures: technology acceptance Method: Interviews | displays emotions; recognises, detects and tracks the person; gives recommendations; understands haptic input | user-preference and emotion-based adaptations
Youssef et al., 2015 | DISH anthropomorphic: No | Subjects: 3 groups (18 graduate students, 6 children with typical hearing, 18 hearing-impaired children) No. of Interactions: one-off Interaction Type: semi-autonomous Measures: learning performance Method: Video analysis | understands knocking behaviour and moves on the table | user-knocking-behaviour-based adaptations
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ahmad, M.; Mubin, O.; Orlando, J. A Systematic Review of Adaptivity in Human-Robot Interaction. Multimodal Technol. Interact. 2017, 1, 14. https://doi.org/10.3390/mti1030014