Methods of Generating Emotional Movements and Methods of Transmitting Behavioral Intentions: A Perspective on Human-Coexistence Robots
Abstract
1. Introduction
2. Generation of Emotional Movements of Robots
2.1. Application of Laban Movement Analysis
2.2. Development from the Circumplex Model of Affect
2.3. Imitation of Human Movement
2.4. Behavior of Non-Biomimetic Robot
2.5. Discussion
3. Transmission of the Behavioral Intention of Robots
3.1. Related Matters
3.1.1. Attention to Preparatory Motion and the Preliminary Motion of Humans
3.1.2. Human Understanding of Behavioral Intentions of Robots
3.1.3. Modeling and Prediction of Human Behavior
3.1.4. Information Transmission by Projection
3.2. Informative Motion in Manipulation
3.2.1. Design of the Reaching Motion of Robots
3.2.2. Design of the Handover Motion of Robots
1. Adding information about an object;
2. Position accuracy, working speed, and synchronization;
3. Object grasping;
4. Handover position;
5. How to release.
3.2.3. Design of the Throwing Motion of Robots
1. Adding information about an object;
2. Generation of the throwing motion (a minimal ballistic sketch follows this list);
3. Learning the throwing motion.
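For intuition on item 2, a minimal ballistic sketch (assuming a point mass and no air drag; an illustration, not a formula taken from the surveyed studies): for release speed $v$, release angle $\theta$, and release height $h$ above the floor, the object lands at distance

$$ d = \frac{v\cos\theta}{g}\left(v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh}\right), $$

since the flight time is $t = \big(v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh}\big)/g$. A robot whose throw-over movement makes $v$ and $\theta$ visually legible thus implicitly announces the landing distance $d$ to the receiver.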
3.3. Transmission of the Movement Intention of Robots
3.3.1. Transmission of Movement Intention by a Lamp
3.3.2. Transmission of Movement Intention by Gaze (Line of Sight)
3.3.3. Transmission of Movement Intention by an Arrow
3.3.4. Announcement of Route and Area of Mobile Robots
3.3.5. Information Transmission Using MR/AR Technology
3.4. Display of Recognition and Intention of Self-Driving Cars
3.4.1. Information Transmission by LED Strings and Signs
3.4.2. Information Transmission by Projection
3.4.3. Comparative Experiment of Displays about Recognition and Intention
3.5. Discussion
4. Remarks
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Karg, M.; Samadani, A.-A.; Gorbet, R.; Kuhnlenz, K.; Hoey, J.; Kulic, D. Body Movements for Affective Expression: A Survey of Automatic Recognition and Generation. IEEE Trans. Affect. Comput. 2013, 4, 341–359.
- Venture, G.; Kulic, D. Robot Expressive Motions: A Survey of Generation and Evaluation Methods. ACM Trans. Hum.-Robot Interact. 2019, 8, 1–17.
- McColl, D.; Hong, A.; Hatakeyama, N.; Nejat, G.; Benhabib, B. A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI. J. Intell. Robot. Syst. 2016, 82, 101–133.
- Saunderson, S.; Nejat, G. How Robots Influence Humans: A Survey of Nonverbal Communication in Social Human-Robot Interaction. Int. J. Soc. Robot. 2019, 11, 575–608.
- Bartenieff, I.; Lewis, D. Body Movement: Coping with the Environment; Gordon and Breach Science Publishers: New York, NY, USA, 1980; pp. 1–304. ISBN 0677055005.
- Hodgson, J. Mastering Movement: The Life and Work of Rudolf Laban; Routledge: New York, NY, USA, 2001; pp. 1–352. ISBN 9780878300808.
- Newlove, J.; Dalby, J. Laban for All; Routledge: New York, NY, USA, 2004; pp. 1–256. ISBN 9780878301805.
- Nakata, T.; Sato, T.; Mizoguchi, H.; Mori, T. Synthesis of robot-to-human expressive behavior for human-robot symbiosis. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’96), Osaka, Japan, 4–8 November 1996; Volume 3, pp. 1608–1613.
- Nakata, T.; Sato, T.; Mori, T. Expression of Emotion and Intention by Robot Body Movement. In Intelligent Autonomous Systems IAS-5; Kakazu, Y., Wada, M., Sato, T., Eds.; IOS Press: Amsterdam, The Netherlands, 1998; pp. 352–359. ISBN 978-90-5199-398-1.
- Nakata, T.; Mori, T.; Sato, T. Analysis of Impression of Robot Bodily Expression. J. Robot. Mechatron. 2002, 14, 27–36.
- Chi, D.; Costa, M.; Zhao, L.; Badler, N. The EMOTE Model for Effort and Shape. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH’00), New Orleans, LA, USA, 23–28 July 2000; pp. 173–182.
- Hachimura, K.; Takashina, K.; Yoshimura, M. Analysis and evaluation of dancing movement based on LMA. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication 2005 (ROMAN 2005), Nashville, TN, USA, 13–15 August 2005; pp. 294–299.
- Barakova, E.I.; Lourens, T. Expressing and interpreting emotional movements in social games with robots. Pers. Ubiquitous Comput. 2010, 14, 457–467.
- Rett, J.; Dias, J. Computational Laban Movement Analysis Using Probability Calculus; University of Coimbra: Coimbra, Portugal, 2007; pp. 1–8.
- Matsumaru, T. Discrimination of emotion from movement and addition of emotion in movement to improve human-coexistence robot’s personal affinity. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2009), Toyama, Japan, 27 September–2 October 2009; pp. 387–394.
- Matsumaru, T. Discrimination and Implementation of Emotions on Zoomorphic Robot Movements. SICE J. Control Meas. Syst. Integr. 2009, 2, 365–372.
- Plutchik, R. Chapter 1-A General Psychoevolutionary Theory of Emotion. In Emotion: Theory, Research and Experience, Vol. 1: Theories of Emotion; Plutchik, R., Kellerman, H., Eds.; Academic Press: New York, NY, USA, 1980; pp. 3–33.
- Plutchik, R. A Psychoevolutionary Theory of Emotions. Soc. Sci. Inf. 1982, 21, 529–553.
- Plutchik, R.; Conte, H.R. The circumplex as a general model of the structure of emotions and personality. In Circumplex Models of Personality and Emotions; Plutchik, R., Conte, H.R., Eds.; American Psychological Association: Washington, DC, USA, 1997; pp. 17–45.
- Plutchik, R. The Nature of Emotions. Am. Sci. 2001, 89, 344–350.
- Clavel, C.; Plessier, J.; Martin, J.-C.; Ach, L.; Morel, B. Combining Facial and Postural Expressions of Emotions in a Virtual Character. In Intelligent Virtual Agents (IVA 2009), Amsterdam, The Netherlands, 14–16 September 2009; Lecture Notes in Computer Science (LNCS); Ruttkay, Z., Kipp, M., Nijholt, A., Vilhjalmsson, H.H., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5773, pp. 287–300.
- Takahashi, K.; Hosokawa, M.; Hashimoto, M. Remarks on designing of emotional movement for simple communication robot. In Proceedings of the 2010 IEEE International Conference on Industrial Technology (ICIT 2010), Vina del Mar, Chile, 14–17 March 2010; pp. 585–590.
- Ekman, P.; Friesen, W.V.; Ellsworth, P. Emotion in the Human Face: Guidelines for Research and an Integration of Findings; Pergamon Press: Oxford, UK, 1971; pp. 1–191. ISBN 978-0-08-016643-8.
- Ekman, P. An argument for basic emotions. Cogn. Emot. 1992, 6, 169–200.
- Samadani, A.-A.; DeHart, B.J.; Robinson, K.; Kulic, D.; Kubica, E.; Gorbet, R. A study of human performance in recognizing expressive hand movements. In Proceedings of the 20th IEEE International Workshop on Robot and Human Communication (RO-MAN 2011), Atlanta, GA, USA, 31 July–3 August 2011; pp. 93–100.
- Samadani, A.-A.; Kubica, E.; Gorbet, R.; Kulic, D. Perception and Generation of Affective Hand Movements. Int. J. Soc. Robot. 2013, 5, 35–51.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161–1178.
- Barrett, L.F.; Russell, J.A. Independence and bipolarity in the structure of current affect. J. Personal. Soc. Psychol. 1998, 74, 967–984.
- Russell, J.A.; Barrett, L.F. Core affect, prototypical emotional episodes, and other things called emotion: Dissecting the elephant. J. Personal. Soc. Psychol. 1999, 76, 805–819.
- Russell, J.A.; Mehrabian, A. Evidence for a Three-Factor Theory of Emotions. J. Res. Personal. 1977, 11, 273–294.
- Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 1996, 14, 261–292.
- Masuda, M.; Kato, S.; Itoh, H. Emotion Detection from Body Motion of Human Form Robot Based on Laban Movement Analysis. In Principles of Practice in Multi-Agent Systems; Lecture Notes in Computer Science; Yang, J.-J., Yokoo, M., Ito, T., Jin, Z., Scerri, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5925, pp. 322–334.
- Nakagawa, K.; Shinozawa, K.; Ishiguro, H.; Akimoto, T.; Hagita, N. Motion modification method to control affective nuances for robots. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2009), St. Louis, MO, USA, 11–15 October 2009; pp. 5003–5008.
- Glowinski, D.; Dael, N.; Camurri, A.; Volpe, G.; Mortillaro, M.; Scherer, K. Toward a Minimal Representation of Affective Gestures. IEEE Trans. Affect. Comput. 2011, 2, 106–118.
- Banziger, T.; Mortillaro, M.; Scherer, K.R. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception. Emotion 2012, 12, 1161–1179.
- Dael, N.; Mortillaro, M.; Scherer, K.R. Emotion expression in body action and posture. Emotion 2012, 12, 1085–1101.
- Claret, J.-A.; Venture, G.; Basanez, L. Exploiting the Robot Kinematic Redundancy for Emotion Conveyance to Humans as a Lower Priority Task. Int. J. Soc. Robot. 2017, 9, 277–292.
- Zecca, M.; Endo, N.; Momoki, S.; Itoh, K.; Takanishi, A. Design of the humanoid robot KOBIAN-preliminary analysis of facial and whole body emotion expression capabilities. In Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008), Daejeon, Korea, 1–3 December 2008; pp. 487–492.
- Zecca, M.; Mizoguchi, Y.; Endo, K.; Iida, F.; Kawabata, Y.; Endo, N.; Itoh, K.; Takanishi, A. Whole body emotion expressions for KOBIAN humanoid robot-preliminary experiments with different emotional patterns. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2009), Toyama, Japan, 27 September–2 October 2009; pp. 381–386.
- Kim, W.H.; Park, J.W.; Lee, W.H.; Kim, W.H.; Chung, M.J. Synchronized multimodal expression generation using editing toolkit for a human-friendly robot. In Proceedings of the 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO 2009), Guilin, China, 19–23 December 2009; pp. 706–710.
- Kim, W.H.; Park, J.W.; Lee, W.H.; Chung, M.J. Robot’s emotional expression generation based on context information and combination of behavior database. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication (ROMAN 2010), Viareggio, Italy, 12–15 September 2010; pp. 316–323.
- Li, J.; Chignell, M. Communication of Emotion in Social Robots through Simple Head and Arm Movements. Int. J. Soc. Robot. 2011, 3, 125–142.
- Erden, M.S. Emotional Postures for the Humanoid-Robot Nao. Int. J. Soc. Robot. 2013, 5, 441–456.
- Coulson, M. Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence. J. Nonverbal Behav. 2004, 28, 117–139.
- McColl, D.; Nejat, G. Recognizing Emotional Body Language Displayed by a Human-like Social Robot. Int. J. Soc. Robot. 2014, 6, 261–280.
- de Meijer, M. The contribution of general features of body movement to the attribution of emotions. J. Nonverbal Behav. 1989, 13, 247–268.
- Wallbott, H.G. Bodily expression of emotion. Eur. J. Soc. Psychol. 1998, 28, 879–896.
- Takahashi, Y.; Kayukawa, Y.; Terada, K.; Inoue, H. Emotional Expressions of Real Humanoid Robots and Their Influence on Human Decision-Making in a Finite Iterated Prisoner’s Dilemma Game. Int. J. Soc. Robot. 2021, 13, 1777–1786.
- de Melo, C.M.; Carnevale, P.; Gratch, J. The Influence of Emotions in Embodied Agents on Human Decision-Making. In Intelligent Virtual Agents; Lecture Notes in Computer Science; Allbeck, J., Badler, N., Bickmore, T., Pelachaud, C., Safonova, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6356, pp. 357–370.
- Karg, M.; Schwimmbeck, M.; Kuhnlenz, K.; Buss, M. Towards mapping emotive gait patterns from human to robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication (ROMAN 2010), Viareggio, Italy, 12–15 September 2010; pp. 258–263.
- Saerbeck, M.; Bartneck, C. Perception of affect elicited by robot motion. In Proceedings of the 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2010), Osaka, Japan, 2–5 March 2010; pp. 53–60.
- Watson, D.; Clark, L.A.; Tellegen, A. Development and Validation of Brief Measures of Positive and Negative Affect: The PANAS Scales. J. Personal. Soc. Psychol. 1988, 54, 1063–1070.
- Crawford, J.R.; Henry, J.D. The Positive and Negative Affect Schedule (PANAS): Construct validity, measurement properties and normative data in a large non-clinical sample. Br. J. Clin. Psychol. 2004, 43, 245–265.
- Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59.
- Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International Affective Picture System (IAPS): Instruction Manual and Affective Ratings; Technical Report A-6; University of Florida, The Center for Research in Psychophysiology: Gainesville, FL, USA, 2005.
- Knight, H.; Thielstrom, R.; Simmons, R. Expressive path shape (swagger): Simple features that illustrate a robot’s attitude toward its goal in real time. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2016), Daejeon, Korea, 9–14 October 2016; pp. 1475–1482.
- Ae, M.; Sakatani, Y.; Yokoi, T.; Hashihara, Y.; Shibukawa, K. Biomechanical Analysis of the Preparatory Motion for Takeoff in the Fosbury Flop. Int. J. Sport Biomech. 1986, 2, 66–77.
- Reitsma, P.S.A.; Andrews, J.; Pollard, N.S. Effect of Character Animacy and Preparatory Motion on Perceptual Magnitude of Errors in Ballistic Motion. Comput. Graph. Forum 2008, 27, 201–210.
- Shiraki, Y.; Yamamoto, S.; Kushiro, K. Effects of Different Modes of Preparatory Motion on Dart-Throwing Performance. Compr. Psychol. 2015, 4, 12.
- Takayama, L.; Dooley, D.; Ju, W. Expressing thought: Improving robot readability with animation principles. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2011), Lausanne, Switzerland, 6–9 March 2011; pp. 69–76.
- Gielniak, M.J.; Thomaz, A.L. Generating anticipation in robot motion. In Proceedings of the 20th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2011), Atlanta, GA, USA, 31 July–3 August 2011; pp. 449–454.
- Kovar, L.; Gleicher, M.; Pighin, F. Motion Graphs. In Proceedings of the 29th Annual Conference on Computer Graphics and Interactive Techniques (ACM SIGGRAPH’02), San Antonio, TX, USA, 23–26 July 2002; pp. 473–482.
- Tanaka, K.; Nishikawa, S.; Kuniyoshi, Y. Effect of preliminary motions on agile motions. In Proceedings of the 16th International Conference on Advanced Robotics (ICAR 2013), Montevideo, Uruguay, 25–29 November 2013; pp. 1–6.
- Wortham, R.H.; Theodorou, A.; Bryson, J.J. Robot Transparency: Improving Understanding of Intelligent Behaviour for Designers and Users. In Towards Autonomous Robotic Systems; Lecture Notes in Computer Science; Gao, Y., Fallah, S., Jin, Y., Lekakou, C., Eds.; Springer: Cham, Switzerland, 2017; Volume 10454, pp. 274–289.
- Martinez, J.; Black, M.J.; Romero, J. On Human Motion Prediction Using Recurrent Neural Networks. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (IEEE CVPR 2017), Honolulu, HI, USA, 21–26 July 2017; pp. 4674–4683.
- Barsoum, E.; Kender, J.; Liu, Z. HP-GAN: Probabilistic 3D Human Motion Prediction via GAN. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2018), Salt Lake City, UT, USA, 18–22 June 2018; pp. 1499–149909.
- Chiu, H.-K.; Adeli, E.; Wang, B.; Huang, D.-A.; Niebles, J.C. Action-Agnostic Human Pose Forecasting. In Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV 2019), Waikoloa, HI, USA, 7–11 January 2019; pp. 1423–1432.
- Wu, E.; Koike, H. FuturePose-Mixed Reality Martial Arts Training Using Real-Time 3D Human Pose Forecasting With a RGB Camera. In Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision (WACV 2019), Waikoloa, HI, USA, 7–11 January 2019; pp. 1384–1392.
- Wu, E.; Koike, H. FuturePong: Real-time Table Tennis Trajectory Forecasting using Pose Prediction Network. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA’20), Honolulu, HI, USA, 25–30 April 2020; pp. 1–8.
- Xu, C.; Fujiwara, M.; Makino, Y.; Shinoda, H. Investigation of Preliminary Motions from a Static State and Their Predictability. J. Robot. Mechatron. 2021, 33, 537–546.
- Wakita, Y.; Hirai, S.; Suehiro, T.; Hori, T.; Fujiwara, K. Information Sharing via Projection Function for Coexistence of Robot and Human. Auton. Robot. 2001, 10, 267–277.
- Machino, T.; Iwaki, S.; Kawata, H.; Yanagihara, Y.; Nanjo, Y.; Shimokura, K. Remote-collaboration system using mobile robot with camera and projector. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation (IEEE ICRA 2006), Orlando, FL, USA, 15–19 May 2006; pp. 4063–4068.
- Lee, J.-H. Human Centered Ubiquitous Display in Intelligent Space. In Proceedings of the 33rd Annual Conference of the IEEE Industrial Electronics Society (IEEE IECON 2007), Taipei, Taiwan, 5–8 November 2007; pp. 22–27.
- Shiotani, T.; Maegawa, K.; Iwamoto, K.; Lee, J.-H. Building a behavior model for the Ubiquitous Display to be used in a large-scale public facility. In Proceedings of the 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI 2012), Daejeon, Korea, 26–28 November 2012; pp. 228–233.
- Kirby, R.; Simmons, R.; Forlizzi, J. COMPANION: A Constraint-Optimizing Method for Person-Acceptable Navigation. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009), Toyama, Japan, 27 September–2 October 2009; pp. 607–612.
- Matsumaru, T. Informative Motion Study to Improve Human-Coexistence Robot’s Personal Affinity. In Proceedings of the IEEE RO-MAN 2009 Workshop on Robot Human Synergies, Toyama, Japan, 28 September 2009; pp. 1–5.
- Dragan, A.D.; Lee, K.C.T.; Srinivasa, S.S. Legibility and predictability of robot motion. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (ACM/IEEE HRI 2013), Tokyo, Japan, 3–6 March 2013; pp. 301–308.
- Dragan, A.D.; Bauman, S.; Forlizzi, J.; Srinivasa, S.S. Effects of Robot Motion on Human-Robot Collaboration. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction (ACM/IEEE HRI 2015), Portland, OR, USA, 2–5 March 2015; pp. 51–58.
- Stulp, F.; Grizou, J.; Busch, B.; Lopes, M. Facilitating intention prediction for humans by optimizing robot motions. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2015), Hamburg, Germany, 28 September–2 October 2015; pp. 1249–1255.
- Matsumaru, T. Handover movement informing receiver of weight load as informative motion study for human-friendly robot. In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009), Toyama, Japan, 27 September–2 October 2009; pp. 299–305.
- Koene, A.; Remazeilles, A.; Prada, M.; Garzo, A.; Puerto, M.; Endo, S.; Wing, A.M. Relative importance of spatial and temporal precision for user satisfaction in human-robot object handover interactions. In Proceedings of the Third International Symposium on New Frontiers in Human Robot Interaction 2014, 50th Annual Convention of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour (AISB-50), London, UK, 1–4 April 2014; pp. 1–7.
- Kshirsagar, A.; Kress-Gazit, H.; Hoffman, G. Specifying and Synthesizing Human-Robot Handovers. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2019), Macau, China, 3–8 November 2019; pp. 5930–5936.
- Maler, O.; Nickovic, D. Monitoring Temporal Properties of Continuous Signals. In Formal Techniques, Modelling and Analysis of Timed and Fault-Tolerant Systems; Lecture Notes in Computer Science; Lakhnech, Y., Yovine, S., Eds.; Springer: Cham, Switzerland, 2004; Volume 3253, pp. 152–166.
- Aleotti, J.; Micelli, V.; Caselli, S. An Affordance Sensitive System for Robot to Human Object Handover. Int. J. Soc. Robot. 2014, 6, 653–666.
- Aleotti, J.; Rizzini, D.L.; Caselli, S. Object categorization and grasping by parts from range scan data. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation (IEEE ICRA 2012), Saint Paul, MN, USA, 14–18 May 2012; pp. 4190–4196.
- Chan, W.P.; Nagahama, K.; Yaguchi, H.; Kakiuchi, Y.; Okada, K.; Inaba, M. Implementation of a framework for learning handover grasp configurations through observation during human-robot object handovers. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (IEEE Humanoids 2015), Seoul, Korea, 3–5 November 2015; pp. 1115–1120.
- Chan, W.P.; Pan, M.K.X.J.; Croft, E.A.; Inaba, M. Characterization of handover orientations used by humans for efficient robot to human handovers. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2015), Hamburg, Germany, 28 September–2 October 2015; pp. 1–6.
- Chan, W.P.; Pan, M.K.X.J.; Croft, E.A.; Inaba, M. An Affordance and Distance Minimization Based Method for Computing Object Orientations for Robot Human Handovers. Int. J. Soc. Robot. 2020, 12, 143–162.
- Suay, H.B.; Sisbot, E.A. A position generation algorithm utilizing a biomechanical model for robot-human object handover. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (IEEE ICRA 2015), Seattle, WA, USA, 26–30 May 2015; pp. 3776–3781.
- Parastegari, S.; Abbasi, B.; Noohi, E.; Zefran, M. Modeling human reaching phase in human-human object handover with application in robot-human handover. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2017), Vancouver, BC, Canada, 24–28 September 2017; pp. 3597–3602.
- Han, Z.; Yanco, H. The Effects of Proactive Release Behaviors During Human-Robot Handovers. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (ACM/IEEE HRI 2019), Daegu, Korea, 11–14 March 2019; pp. 440–448.
- Matsumaru, T. Design and Evaluation of Throw-over Movement Informing a Receiver of Object Landing Distance. In Advances in Robotics—Modeling, Control and Applications; Ciufudean, C., Garcia, L., Eds.; iConcept Press: Hong Kong, China, 2013; pp. 171–194.
- Lombai, F.; Szederkenyi, G. Throwing motion generation using nonlinear optimization on a 6-degree-of-freedom robot manipulator. In Proceedings of the 2009 IEEE International Conference on Mechatronics (IEEE ICM 2009), Malaga, Spain, 14–17 April 2009; pp. 1–6.
- Yedeg, E.L.; Wadbro, E. State constrained optimal control of a ball pitching robot. Mech. Mach. Theory 2013, 69, 337–349.
- Mulling, K.; Kober, J.; Kroemer, O.; Peters, J. Learning to select and generalize striking movements in robot table tennis. Int. J. Robot. Res. 2013, 32, 263–279.
- Zeng, A.; Song, S.; Lee, J.; Rodriguez, A.; Funkhouser, T. TossingBot: Learning to Throw Arbitrary Objects with Residual Physics. IEEE Trans. Robot. 2020, 36, 1307–1319.
- Matsumaru, T.; Hagiwara, K. Preliminary-announcement and display for translation and rotation of human-friendly mobile robot. In Proceedings of the 10th IEEE International Workshop on Robot and Human Interactive Communication (IEEE ROMAN 2001), Bordeaux and Paris, France, 18–21 September 2001; pp. 213–218.
- Matsumaru, T.; Endo, H.; Ito, T. Examination by software simulation on preliminary-announcement and display of mobile robot’s following action by lamp or blowouts. In Proceedings of the 2003 IEEE International Conference on Robotics and Automation (IEEE ICRA 2003), Taipei, Taiwan, 14–19 September 2003; Volume 1, pp. 362–367.
- Muramatsu, S.; Higashi, S.; Chugo, D.; Yokota, S.; Hashimoto, H. Consideration of the preliminary announcement function for the human friendly service robot. In Proceedings of the 42nd Annual Conference of the IEEE Industrial Electronics Society (IECON 2016), Florence, Italy, 23–26 October 2016; pp. 5868–5872.
- Kannan, S.S.; Lee, A.; Min, B.-C. External Human-Machine Interface on Delivery Robots: Expression of Navigation Intent of the Robot. In Proceedings of the 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN 2021), Vancouver, BC, Canada, 8–12 August 2021; pp. 1305–1312.
- Matsumaru, T.; Iwase, K.; Akiyama, K.; Kusada, T.; Ito, T. Mobile Robot with Eyeball Expression as the Preliminary-Announcement and Display of the Robot’s Following Motion. Auton. Robot. 2005, 18, 231–246.
- Lu, D.V.; Smart, W.D. Towards more efficient navigation for robots and humans. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), Tokyo, Japan, 3–7 November 2013; pp. 1707–1713.
- Yamashita, S.; Ikeda, T.; Shinozawa, K.; Iwaki, S. Evaluation of Robots that Signals a Pedestrian Using Face Orientation Based on Moving Trajectory Analysis. In Proceedings of the 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN 2019), New Delhi, India, 14–18 October 2019; pp. 1–8.
- Matsumaru, T. Mobile Robot with Preliminary-announcement and Indication Function of Forthcoming Operation using Flat-panel Display. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation (IEEE ICRA 2007), Rome, Italy, 10–14 April 2007; pp. 1774–1781.
- Coovert, M.D.; Lee, T.; Shindev, I.; Sun, Y. Spatial augmented reality as a method for a mobile robot to communicate intended movement. Comput. Hum. Behav. 2014, 34, 241–248.
- Shrestha, M.C.; Kobayashi, A.; Onishi, T.; Yanagawa, H.; Yokoyama, Y.; Uno, E.; Schmitz, A.; Kamezaki, M.; Sugano, S. Exploring the use of light and display indicators for communicating directional intent. In Proceedings of the 2016 IEEE International Conference on Advanced Intelligent Mechatronics (IEEE AIM 2016), Banff, AB, Canada, 12–15 July 2016; pp. 1651–1656.
- Shrestha, M.C.; Onishi, T.; Kobayashi, A.; Kamezaki, M.; Sugano, S. Communicating Directional Intent in Robot Navigation using Projection Indicators. In Proceedings of the 27th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2018), Nanjing, China, 27–31 August 2018; pp. 746–751.
- Huy, D.Q.; Vietcheslav, I. See-through and spatial augmented reality-a novel framework for human-robot interaction. In Proceedings of the 2017 3rd International Conference on Control, Automation and Robotics (IEEE ICCAR 2017), Nagoya, Japan, 24–26 April 2017; pp. 719–726.
- Chadalavada, R.T.; Andreasson, H.; Schindler, M.; Palm, R.; Lilienthal, A.J. Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human-robot interaction. Robot. Comput.-Integr. Manuf. 2020, 61, 101830.
- Hetherington, N.J.; Croft, E.A.; Van der Loos, H.F.M. Hey Robot, Which Way Are You Going? Nonverbal Motion Legibility Cues for Human-Robot Spatial Interaction. IEEE Robot. Autom. Lett. 2021, 6, 5010–5015.
- Matsumaru, T.; Kusada, T.; Iwase, K. Mobile Robot with Preliminary-Announcement Function of Forthcoming Motion using Light-ray. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2006), Beijing, China, 9–15 October 2006; pp. 1516–1523.
- Matsumaru, T. Mobile Robot with Preliminary-announcement and Display Function of Forthcoming Motion using Projection Equipment. In Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication (IEEE ROMAN 2006), Hatfield, UK, 6–8 September 2006; pp. 443–450.
- Matsumaru, T. Experimental Examination in simulated interactive situation between people and mobile robot with preliminary-announcement and indication function of upcoming operation. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation (IEEE ICRA 2008), Pasadena, CA, USA, 19–23 May 2008; pp. 3487–3494.
- Chadalavada, R.T.; Andreasson, H.; Krug, R.; Lilienthal, A.J. That’s on my mind! robot to human intention communication through on-board projection on shared floor space. In Proceedings of the 2015 European Conference on Mobile Robots (ECMR 2015), Lincoln, UK, 2–4 September 2015; pp. 1–6.
- Watanabe, A.; Ikeda, T.; Morales, Y.; Shinozawa, K.; Miyashita, T.; Hagita, N. Communicating robotic navigational intentions. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IEEE/RSJ IROS 2015), Hamburg, Germany, 28 September–2 October 2015; pp. 5763–5769.
- Rosen, E.; Whitney, D.; Phillips, E.; Chien, G.; Tompkin, J.; Konidaris, G.; Tellex, S. Communicating Robot Arm Motion Intent Through Mixed Reality Head-Mounted Displays. In Robotics Research (18th ISRR); Amato, N., Hager, G., Thomas, S., Torres-Torriti, M., Eds.; Springer: Cham, Switzerland, 2017; pp. 301–316.
- Walker, M.; Hedayati, H.; Lee, J.; Szafir, D. Communicating Robot Motion Intent with Augmented Reality. In Proceedings of the 13th ACM/IEEE International Conference on Human-Robot Interaction (ACM/IEEE HRI 2018), Chicago, IL, USA, 5–8 March 2018; pp. 316–324.
- Dezeen. Umbrellium Develops Light-Up Crossing That Only Appears When Needed. Available online: https://www.dezeen.com/2017/10/12/umbrellium-develops-interactive-road-crossing-that-only-appears-when-needed-technology/ (accessed on 6 May 2022).
- Correa, A.; Walter, M.R.; Fletcher, L.; Glass, J.; Teller, S.; Davis, R. Multimodal Interaction with an Autonomous Forklift. In Proceedings of the 5th ACM/IEEE International Conference on Human-Robot Interaction (ACM/IEEE HRI 2010), Osaka, Japan, 2–5 March 2010; pp. 243–250.
- Walter, M.R.; Antone, M.; Chuangsuwanich, E.; Correa, A.; Davis, R.; Fletcher, L.; Frazzoli, E.; Friedman, Y.; Glass, J.; How, J.P.; et al. A Situationally Aware Voice-Commandable Robotic Forklift Working Alongside People in Unstructured Outdoor Environments. J. Field Robot. 2015, 32, 590–628.
- Florentine, E.; Andersen, H.; Ang, M.A.; Pendleton, S.D.; Fu, G.M.J.; Ang, M.H., Jr. Self-driving vehicle acknowledgement of pedestrian presence conveyed via Light-Emitting Diodes. In Proceedings of the 2015 International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (IEEE HNICEM 2015), Cebu, Philippines, 9–12 December 2015; pp. 1–6.
- Florentine, E.; Ang, M.A.; Pendleton, S.D.; Andersen, H.; Ang, M.H., Jr. Pedestrian Notification Methods in Autonomous Vehicles for Multi-Class Mobility-on-Demand Service. In Proceedings of the Fourth International Conference on Human Agent Interaction (ACM HAI’16), Singapore, 4–7 October 2016; pp. 387–392.
- Habibovic, A.; Lundgren, V.M.; Andersson, J.; Klingegard, M.; Lagstrom, T.; Sirkka, A.; Fagerlonn, J.; Edgren, C.; Fredriksson, R.; Krupenia, S.; et al. Communicating Intent of Automated Vehicles to Pedestrians. Front. Psychol. 2018, 9, 1336.
- Dey, D.; Habibovic, A.; Pfleging, B.; Martens, M.; Terken, J. Color and Animation Preferences for a Light Band eHMI in Interactions Between Automated Vehicles and Pedestrians. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (ACM CHI 2020), Honolulu, HI, USA, 25–30 April 2020; pp. 1–13.
- Ochiai, Y.; Toyoshima, K. Homunculus: The Vehicle as Augmented Clothes. In Proceedings of the 2nd Augmented Human International Conference (AH’11), Tokyo, Japan, 13 March 2011; pp. 1–4.
- Mercedes-Benz. F 015 Luxury in Motion. Available online: https://www.mercedes-benz.com/en/innovation/autonomous/research-vehicle-f-015-luxury-in-motion/ (accessed on 6 May 2022).
- Mitsubishi Electric. Mitsubishi Electric Introduces Road-illuminating Directional Indicators. Available online: https://www.mitsubishielectric.com/news/2015/1023_zoom_01.html (accessed on 6 May 2022).
- de Clercq, K.; Dietrich, A.; Velasco, J.P.N.; de Winter, J.; Happee, R. External Human-Machine Interfaces on Automated Vehicles: Effects on Pedestrian Crossing Decisions. Hum. Factors 2019, 61, 1353–1370.
- Matsumaru, T.; Kudo, S.; Endo, H.; Ito, T. Examination on a Software Simulation of the Method and Effect of Preliminary-announcement and Display of Human-friendly Robot’s Following Action. Trans. Soc. Instrum. Control Eng. 2004, 40, 189–198. (In Japanese)
- Matsumaru, T. Development of Four Kinds of Mobile Robot with Preliminary-Announcement and Indication Function of Upcoming Operation. J. Robot. Mechatron. 2007, 19, 148–159.
- Matsumaru, T. Evaluation Experiment in Simulated Interactive Situation between People and Mobile Robot with Preliminary-Announcement and Indication Function of Upcoming Operation. Trans. Hum. Interface Soc. 2008, 10, 11–20. (In Japanese)
Model | Reference | Emotion
---|---|---
Plutchik’s wheel of emotions | [17,18,19,20] | Joy, anger, sadness, fear
Ekman’s basic emotions | [23,24] | Joy, sadness, surprise, anger, fear, disgust
Russell’s circumplex model of affect | [27,28,29] | Positive/negative valence (pleasantness), high/low arousal (activation)
Method | Study
---|---
LMA (Laban movement analysis) | [8,9,10] (1996, 1998, 2002); [11] (2000); [12] (2005); [13] (2010); [15,16] (2009); [22] (2010)
BAP (body action and posture) coding system | [36] (2012)
Body language descriptors | [45] (2014)
Method | Study
---|---
PCA (principal component analysis) + LDA (linear discriminant analysis) | [15,16] (2009)
FPCA (functional principal component analysis) | [25,26] (2011, 2013)
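For concreteness, a minimal sketch of the PCA + LDA discrimination pipeline listed above, written in Python with scikit-learn; the array shapes, feature semantics, and class labels are hypothetical placeholders rather than data from the cited studies.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical stand-in data: 120 movement samples x 20 kinematic features
# (e.g., joint velocities, accelerations, posture extents).
X = rng.normal(size=(120, 20))
y = rng.integers(0, 4, size=120)  # 4 emotion classes, e.g., joy/anger/sadness/fear

# PCA compresses the kinematic features; LDA then finds the directions
# that best separate the emotion classes in the reduced space.
clf = make_pipeline(PCA(n_components=8), LinearDiscriminantAnalysis())
clf.fit(X, y)
print(clf.predict(X[:5]))  # predicted emotion-class indices for five samples
```

Reducing dimensionality before the discriminant step is the usual motivation for this pairing: LDA is ill-conditioned when the feature count approaches the sample count, which is common for full-body kinematic descriptors.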
Method | Study
---|---
Feature superimposition | [15,16] (2009); [56] (2016)
Mapping valence level and arousal level to basic posture and joint velocity | [33] (2009)
Translation from emotional information in PAD space to kinematic features in JVG space | [37] (2017)
Imitation of human movement | [38,39] (2008, 2009); [40,41] (2009, 2010); [50] (2010); [42] (2011); [43] (2013); [45] (2014)
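To make the valence/arousal mapping row above concrete, a minimal Python sketch of one plausible realization; the scaling constants, posture vectors, and function names are assumptions for illustration, not the cited authors' method.

```python
import numpy as np

def modulate_motion(joint_traj, valence, arousal, open_posture, closed_posture):
    """joint_traj: (T, J) joint angles over time; valence, arousal in [-1, 1]."""
    # Arousal scales playback speed: higher arousal -> fewer samples at a fixed
    # sampling rate, i.e., a faster, higher-joint-velocity rendition.
    speed = 1.0 + 0.5 * arousal
    t_src = np.linspace(0.0, 1.0, joint_traj.shape[0])
    t_new = np.linspace(0.0, 1.0, max(2, round(joint_traj.shape[0] / speed)))
    resampled = np.column_stack([np.interp(t_new, t_src, joint_traj[:, j])
                                 for j in range(joint_traj.shape[1])])
    # Valence biases the base posture: positive -> open, negative -> closed.
    w = (valence + 1.0) / 2.0
    posture_bias = w * open_posture + (1.0 - w) * closed_posture
    return resampled + posture_bias

# Example: a sad (low-valence, low-arousal) variant of a 2-joint waving motion.
base = np.sin(np.linspace(0, 2 * np.pi, 50))[:, None] * np.array([[0.3, 0.1]])
sad = modulate_motion(base, valence=-0.8, arousal=-0.6,
                      open_posture=np.array([0.2, 0.2]),
                      closed_posture=np.array([-0.2, -0.2]))
```

The point of the sketch is the separation of concerns: the base trajectory fixes what the robot does, while the affect parameters only modulate how it is performed.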
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).