Article

Improving HRI with Force Sensing

by Akiyoshi Hayashi, Liz Katherine Rincon-Ardila and Gentiane Venture *
Department of Mechanical Systems Engineering, Tokyo University of Agriculture and Technology, Koganei 184-8588, Japan
* Author to whom correspondence should be addressed.
Machines 2022, 10(1), 15; https://doi.org/10.3390/machines10010015
Submission received: 15 November 2021 / Revised: 8 December 2021 / Accepted: 19 December 2021 / Published: 24 December 2021
(This article belongs to the Special Issue Advances of Japanese Machine Design)

Abstract: In a future society where robots and humans live together, human–robot interaction (HRI) will be an important field of research. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the important role it plays in human–human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main findings. First, touch gestures can be collected with two types of sensors, and the collected data can be classified using machine learning. Second, communication between humans and robots through touch can improve the user's impression of the robot.

1. Introduction

1.1. Background

AI and robotics technologies are making significant progress. It is said that within a few decades, people and robots will work together, sharing the same workspace [1,2]. With these changes, robots will have more opportunities to work in close contact with humans, so there is a need to achieve smooth communication between people and robots [3]. The research field of human–robot interaction (HRI) is becoming critical. Human–human communication relies on visual information such as facial expressions, gaze, and gestures, or on audio information such as words [4,5]. However, as the distance between humans and robots becomes shorter, we must start thinking about the possibility of people spontaneously making contact with robots during work [6]. Social roboticists must address the issues of safety and social acceptance of robots' social interaction with humans [7]. While haptics is an important field of robotics research, touch, which transmits and induces affective content, has received little attention in social robotics so far, particularly regarding how humans touch robots. The sense of touch is paramount in both physical and social interactions. Previous research suggests that physical contact is rich in information: it can convey different emotions, and haptics has been shown to effectively communicate valence and arousal as well as emotions of happiness, sadness, anger, and fear [4,8]. Communication by contact is therefore essential for smooth interaction between robots and people. This work aims to improve HRI by introducing communication by contact. In this paper, we present the results of our work on touch communication between robots and humans. We show the overall system under development in Figure 1. The present paper focuses on the “Sensing touch” and “Classifying touch gestures” blocks.

1.2. Related Works

The research on contact communication of interest can be divided into two main categories: human–human contact communication and human–robot contact communication.
Hertenstein et al. conducted human-to-human contact interaction experiments to understand how well the sense of touch can convey different emotions. In [9], subjects were touched from the elbow to the tip of the hand, and each touch conveyed a specified emotion. Hertenstein et al. showed that six emotions, anger, fear, disgust, love, gratitude, and sympathy, could be identified by the subjects with above-chance probability. They conducted a similar experiment in which the area of contact was extended to the entire body. The results showed that subjects could discriminate eight emotions, anger, fear, disgust, love, gratitude, sympathy, happiness, and sadness, from the tactile sensations and the location of contact with greater than chance probability [10]. Research on how a device conveys emotions through simple vibrotactile feedback was conducted in [11]. Minamizawa et al. recorded 28 vibration sample sets for four different emotions (joy, anger, sadness, relaxation) and replayed the vibrations to test how well they could be recognized. Their results show that vibration can be used as a medium for expressing and recognizing the four selected emotions, with accuracies of 46.9–71.4%. These accuracies are similar to those obtained with other modalities such as facial expression, voice, and whole-body movement, which average between 56% and 70%.
On human–robot communication, Yohanan et al. created an animal-like robot called the “Haptic Creature”, a sort of small lap-pet stuffed animal with as few non-tactile artifacts as possible. Through experiments with the Haptic Creature, they proposed a classification of 30 types of touch gestures [5]. Andreasson et al. conducted an HRI experiment using Nao. In their study, emotions such as love, sympathy, gratitude, sadness, happiness, disgust, anger, and fear were transmitted to Nao through contact in random order. They found a correlation between emotion and frequency of contact and tried to estimate the emotions [12].

2. Materials

2.1. Collecting Expressive Touch Gestures

It is difficult to estimate human emotions and messages directly from the contact force data obtained during a hand touch. Previous research suggested dividing the problem into two phases [13]: first, classifying touch gestures, i.e., how humans touch the robot; second, interpreting these gestures as messages or emotions [13]. A minimal sketch of such a two-stage pipeline is given below.
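The following Python sketch illustrates the two-stage idea. The function names, the dummy decision rule, and the gesture-to-emotion mapping (loosely inspired by Table 3) are illustrative assumptions, not the implementation used in this work.

```python
# Two-stage pipeline sketch: (1) classify the touch gesture from force data,
# (2) interpret the gesture as candidate emotions. All names and the mapping
# below are illustrative placeholders, not the authors' implementation.
import numpy as np

# Hypothetical mapping, loosely inspired by Table 3 of this paper.
GESTURE_TO_EMOTIONS = {
    "stroke": ["love", "sympathy", "sadness", "happiness"],
    "slap": ["disgust", "anger"],
    "poke": ["disgust", "anger", "fear"],
}

def classify_gesture(force_window: np.ndarray) -> str:
    """Stage 1: map a window of 3-axis force data to a gesture label.
    In the paper this stage is a CNN; here a trivial rule stands in for it."""
    peak = float(np.abs(force_window).max())
    return "slap" if peak > 10.0 else "stroke"  # dummy decision rule

def interpret_emotions(gesture: str) -> list:
    """Stage 2: map the recognized gesture to candidate emotions."""
    return GESTURE_TO_EMOTIONS.get(gesture, ["unknown"])

window = np.random.randn(180, 3)  # 180 samples of 3-axis force (placeholder)
print(interpret_emotions(classify_gesture(window)))
```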

2.2. PAD Emotional State Model

The PAD emotional state model is one of the psychological models used to describe and measure emotional states. In contrast to Ekman's basic emotions model, PAD uses three dimensions, pleasure, arousal, and dominance, to represent all emotions [14]. All three dimensions take values in the range −1 to 1. In this research, the PAD is used to check the subjects' emotions. To make it easier for participants to evaluate their own emotions, the SAM (self-assessment manikin) scale was used [15]. It makes evaluating each dimension intuitive and easy using the visual scale presented in Figure 2.
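As a concrete illustration, a PAD rating can be stored as a triple of values constrained to [−1, 1]. The following minimal sketch is ours (the class name and validation are assumptions, not from the paper); it is instantiated with the average rating reported for "love" in Table 4.

```python
# A minimal sketch of a PAD (pleasure-arousal-dominance) rating record,
# with each dimension constrained to the range [-1, 1] as described above.
from dataclasses import dataclass

@dataclass(frozen=True)
class PADRating:
    pleasure: float
    arousal: float
    dominance: float

    def __post_init__(self):
        # Validate that every dimension lies in the PAD range [-1, 1].
        for name in ("pleasure", "arousal", "dominance"):
            value = getattr(self, name)
            if not -1.0 <= value <= 1.0:
                raise ValueError(f"{name} must be in [-1, 1], got {value}")

# Example: the average rating reported for "love" in Table 4.
love = PADRating(pleasure=0.71, arousal=0.14, dominance=0.07)
print(love)
```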

2.3. Equipment

We used a 5-DOF robot arm (Figure 3). It is a custom-designed robot with Dynamixel servomotors (Robotis) in each joint. The robot is equipped only with encoders and has no sensors to detect contact with humans or objects (unless contact is estimated through current measurement). To detect the contact location and intensity, sensors must be installed. To define the most appropriate locations for the sensors, we conducted a first experiment in which participants were invited to touch the robot, and we observed how and where they touched the inorganic, non-zoomorphic robot arm.
In our research, we use two types of tactile sensors: ShokacChipTM and the Xela Robotics tactile sensor. ShokacChipTM is a series of multi-axis tactile sensors developed by Touchence [16]. Three piezoelectric elements are arranged along three axes to achieve high sensitivity in all three directions [11]. The Xela Robotics tactile sensor is a series of 3-axis tactile sensor modules developed by Xela Robotics [17]. In this study, we used a 4 × 4 arrangement of sensor cells that can sense the load in the three axis directions; these sensors are more versatile than the ShokacChipTM sensors, but their measurements are affected by changes in the magnetic field and, therefore, cannot be used near the robot's motors.

2.4. Touch Gestures

We use the 25 touch gestures with the definitions presented in Table 1, based on Yohanan's touch gesture definitions. We omitted the gestures that were irrelevant and added those needed for our specific robot arm. For classification, we labeled "shake" performed by pushing with a fingertip as shake_1 and "shake" performed by grabbing the robot arm with a hand as shake_2. These touch gestures are performed on the robot arm's body, and the locations where participants touched the robot were recorded. The sensors were placed accordingly for the second experiment, which is explained in Section 4. Examples of participants touching the robot are shown in Figure 4.

3. Experiment on How Humans Convey Emotion to Robot Arm

3.1. Method

There are various constraints to consider when installing sensors all over the body of the robot arm used in this research, such as the influence of the motors' magnetic fields and the range of motion of the joints. In addition, unlike the Nao and the Haptic Creature, it was not known how people would attempt to make contact with an inorganic, non-zoomorphic robot arm that was not designed in the HRI style. Therefore, before installing the sensors on the robot arm, we conducted an experiment to observe how and where people attempted to make contact with the robot arm used in this study. The experiment was conducted with seven participants, based on Andreasson et al.'s experiment [12]. The participants were asked to express their feelings of love, sympathy, gratitude, sadness, happiness, disgust, anger, and fear to the robot arm in random order, using free contact and without a time limit. We used two cameras to capture the participants' behaviors from different viewpoints (Figure 5). After the experiment, we visually evaluated from the recorded videos which part of the body the participants touched and what contact they made when conveying each emotion. Each part of the robot arm was labeled by analogy with a human arm, as shown in Figure 3b. After communicating each emotion to the robot arm, the participants were asked to quantify their emotions using the SAM to rate pleasure, arousal, and dominance from −1 to 1.

3.2. Results and Discussion

3.2.1. Location

Table 2 shows the percentage of participants who touched each area of the robot defined in Figure 3b. A result of 0% means that no participant touched the area, and 100% means that all participants touched the area. Only sadness is expressed as a percentage of the total number of participants because some participants could not express this feeling and did not touch the robot to express it.
Our results show that the most touched parts, regardless of the emotion, are the most distal parts of the robot, which are also the closest to the participant: the hand (438%, summed over all emotions), followed by the wrist (176%) and the forearm (162%). The central parts, the elbow and the upper arm, are the least touched, with 14% and 29%, respectively. This can be explained by the fact that these parts are the farthest from the participants and may require an additional effort to lean towards the robot.
The results also show that for positive emotions such as love (186%), sympathy (114%), and gratitude (157%), participants touched the robot more often than for negative emotions, with the exception of sadness (133%).
Andreasson's research, which used the Nao robot, showed that the most plentiful interaction among the eight emotions occurred when the subjects touched Nao to convey love. A similar trend was observed in Table 2, with all parts of the robot touched except the shoulder. The same study reported that the least plentiful interaction occurred when the subjects touched Nao to convey disgust. Table 2 shows that the participants barely touched the robot arm when conveying disgust (85.7% in two places only: the hand and the shoulder). Although our experiment involved few participants, this result suggests that contact communication may be similar across robots with entirely different forms, such as Nao and our robot arm. These results also informed us where to position the touch sensors for our experiments.

3.2.2. Classification of Touch Gestures

Table 3 shows the percentage of occurrences of each touch gesture observed for each emotion during our experiment. In particular, in the case of love, 57.1% of participants stroked (stroke) and 14.3% of them tapped (tap) the entire body of the robot arm except for the shoulder. On the other hand, for disgust, anger, and fear, which are negative emotions toward the partner, the touched points were mainly limited to the tip of the hand. The touch gestures observed were mainly aggressive, such as pinch, poke, and slap, while disgust and fear were limited to one or a few touches for all participants. These touch gestures were not used for the other emotions; they can therefore serve as markers for distinguishing positive from negative emotions. However, touch gestures such as pat, tap, and stroke are used to communicate multiple emotions such as love, sympathy, gratitude, sadness, and happiness. This result suggests that it is more difficult to estimate which emotion is being conveyed by a single touch gesture, even if the general type of emotion can be identified as positive.
In their experiment, Yohanan et al. surveyed which touch gestures participants used to convey each emotion by contact. They used nine emotions: distressed, aroused, excited, miserable, neutral, pleased, depressed, sleepy, and relaxed. A direct comparison is difficult because these emotions differ from those in our research, but "pleased" is similar to happiness. When Yohanan's subjects conveyed "pleased" to the Haptic Creature, they mainly used stroke, hug, hold, rub, pat, cradle, and tickle. Hug and cradle are not in Table 3 because they cannot be performed on the robot arm. Stroke, massage, and rub were used when participants in our experiment conveyed happiness, but tickle and pat were not. One possible reason for the difference may be the robot's shape and texture: our robot is made of aluminum, while the Haptic Creature in Yohanan's work is covered with faux fur.
Table 4 shows the median, average, standard deviation, and variance of the PAD values that participants reported after conveying each emotion by touch. Except for sympathy, almost all P, A, and D variances are less than 0.10. Sympathy is a complex emotion that depends on the condition of the other party, and it may not be easy to feel sympathy toward the robot arm. Sympathy is therefore not considered in the following discussion.
Comparing the results in Table 3 and Table 4, when the participants performed "pinch", "poke", "press", and "slap", which are rather aggressive touch gestures (Table 3), the PAD results in Table 4 show low pleasure and high arousal. "Love" and "happiness" show very similar PAD values, with very high average pleasure values of 0.71 and 0.79, respectively. However, "massage" and "rub" are not seen with "love" but only with "happiness". In addition, "scratch", which is similar to "rub", is seen only with "love".

4. Experiment Using Tactile Sensors

4.1. Method

After locating the sensors on the robot, we conducted a second experiment. Each participant interacted with the robot arm during this experiment, as shown in Figure 2. Two cameras were used to capture the participants' movements during the experiment. Fifteen participants communicated seven emotions (love, gratitude, sadness, happiness, disgust, anger, and fear) to the robot arm using free contact with no time limit. The only constraint was that they had to touch the robot on one of the sensors of their choice. The measured data are time series of forces. Each time an emotion was transmitted, the participant evaluated the transmitted emotion using the SAM scale. We conducted an evaluation practice before starting the experiment to ensure that the participants understood the SAM scale. For the touch gestures, the participants were presented with the definitions shown in Table 1, and the movements were demonstrated in a video. To protect the sensors during the experiment, we incorporated a mechanism in which the robot performs a single predefined rejection action whenever the load exceeds the allowable limit, as sketched below.
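The following is a minimal sketch of that protection logic, assuming an illustrative load limit and a hypothetical robot interface; it is not the actual control code.

```python
# Minimal sketch of the sensor-protection logic described above: when the
# measured load exceeds an allowable limit, the robot performs a single,
# predefined rejection motion. The limit value, function names, and robot
# interface are illustrative assumptions, not the authors' control code.
import numpy as np

FORCE_LIMIT_N = 20.0  # illustrative allowable load, not from the paper

def is_overloaded(force_xyz: np.ndarray, limit: float = FORCE_LIMIT_N) -> bool:
    """Return True if the 3-axis force magnitude exceeds the allowable limit."""
    return float(np.linalg.norm(force_xyz)) > limit

class DummyRobot:
    """Stand-in for the robot controller; only the rejection action is modeled."""
    def perform_rejection_motion(self) -> None:
        print("Load limit exceeded: performing the single rejection motion.")

def on_sensor_sample(force_xyz: np.ndarray, robot: DummyRobot) -> None:
    """Called for each sensor sample; triggers the rejection action if needed."""
    if is_overloaded(force_xyz):
        robot.perform_rejection_motion()

on_sensor_sample(np.array([5.0, 3.0, 25.0]), DummyRobot())
```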
For the semantic differential (SD) method, the adjective pairs shown in Table 5 were used. Each adjective pair was rated on a seven-point scale, and the participants were asked to evaluate their impressions of the robot. These adjective pairs were selected based on Kanda et al.'s study [18]. They include factors such as familiarity and approachability with the robot, as well as adjectives that evaluate effects based on the subject's pleasure and displeasure.
Based on the results of the previous section, two ShokacChipTM and two Xela Robotics tactile sensors were installed on the robot arm's hand and forearm (Figure 6). SolidWorks was used to design the sensor mounting parts.

4.2. Classifying Touch Gestures

We used two different types of sensors. Both types of sensors output 3-dimensional force data over time. From these force profiles, we analyzed and classified touch gestures quantitatively. Since data for each sensor have their own specificities, we used two different architectures for the touch gesture classification.

4.2.1. Classifying Touch Gestures with ShokacChipTM Sensors

A 1D CNN was used for the model, as shown in Figure 7. The chosen optimizer is Adam with a "sparse_categorical_crossentropy" loss function. Data from thirteen participants, randomly split into 2505 training samples and 412 test samples, were used. The number of epochs is 20, and the training time is 20 s on a GeForce RTX 2080. The hyperparameters were set using particle swarm optimization, and the learning rate is 0.01. The hyperparameters are given in Table 6.
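For illustration, the following is a minimal Keras sketch consistent with the description above and with Table 6. The input shape (180 time steps × 6 force channels, i.e., two 3-axis ShokacChipTM sensors) and the class count are assumptions inferred from the reported output shapes and parameter counts, not a reproduction of the authors' code.

```python
# Minimal Keras sketch of the 1D CNN described in Section 4.2.1 / Table 6.
import tensorflow as tf
from tensorflow.keras import layers

TIME_STEPS = 180   # samples per gesture window (from the Table 6 output shapes)
CHANNELS = 6       # assumed: 2 ShokacChip sensors x 3 force axes
NUM_CLASSES = 16   # per Table 6 final layer (Figure 10 reports 15 gesture labels)

model = tf.keras.Sequential([
    layers.Input(shape=(TIME_STEPS, CHANNELS)),
    layers.Conv1D(32, kernel_size=5, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# Training as described in the text (20 epochs on ~2505 samples), e.g.:
# model.fit(x_train, y_train, epochs=20, validation_data=(x_test, y_test))
```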

4.2.2. Classifying Touch Gestures with Xela Robotics Tactile Sensors

Since the data acquired by the Xela Robotics tactile sensor can be treated as video, as shown in Figure 8, we used a 3D CNN, as shown in Figure 9. We used the same optimizer and loss function as with the other model. Data from thirteen participants, randomly split into 2580 training samples and 389 test samples, were used. The number of epochs is 55, and the training time is 108 s on a GeForce RTX 2080. The hyperparameters and the number of epochs were set using particle swarm optimization, and the learning rate is 0.01. The hyperparameters for this model are shown in Table 7.
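Similarly, the following is a minimal Keras sketch of a 3D CNN consistent with Table 7. The input shape (180 time steps × an 8 × 4 taxel grid × 3 force axes, i.e., two 4 × 4 Xela modules side by side) is an assumption inferred from the reported parameter counts, not the authors' exact code.

```python
# Minimal Keras sketch of the 3D CNN described in Section 4.2.2 / Table 7.
import tensorflow as tf
from tensorflow.keras import layers

INPUT_SHAPE = (180, 8, 4, 3)  # (time, taxel rows, taxel cols, force axes), assumed
NUM_CLASSES = 16              # gesture labels (see Figure 11 and Table 7)

model = tf.keras.Sequential([
    layers.Input(shape=INPUT_SHAPE),
    layers.Conv3D(16, kernel_size=(5, 2, 2), padding="same", activation="relu"),
    layers.Conv3D(16, kernel_size=(5, 2, 2), padding="same", activation="relu"),
    layers.MaxPooling3D(pool_size=(2, 1, 1)),  # downsample along time only
    layers.Conv3D(32, kernel_size=(3, 2, 2), padding="same", activation="relu"),
    layers.Conv3D(32, kernel_size=(3, 2, 2), padding="same", activation="relu"),
    layers.MaxPooling3D(pool_size=(2, 1, 1)),
    layers.Conv3D(32, kernel_size=(3, 2, 2), padding="same", activation="relu"),
    layers.GlobalAveragePooling3D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# Training as described in the text: 55 epochs on ~2580 samples.
```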

4.3. Results of Touch Gestures’ Classification

The confusion matrices for each sensor are shown in Figure 10 and Figure 11.
For the ShokacChip sensor, the overall correct classification rate was 49.5%, which is higher than chance. Looking at each label, tickle and slap are classified with high probability, followed by pat and shake_1. On the other hand, labels corresponding to similar actions, such as massage and shake_2, hold by hands and hold by hands gently, and rub and scratch, are misclassified with high probability. For example, massage and shake_2 are both actions in which one hand grasps a set of sensors and repeatedly increases or decreases the force, so there was no significant difference between them. In the case of hold by hands and hold by hands gently, the only difference was the strength of the gesture, so we believe that the misclassification was caused by each subject having different criteria for weak and strong forces. In the case of stroke and tap, both gestures are very short, so their distinguishing features could not be captured well.
For the Xela sensor, the overall correct classification rate is 75%, which is much higher than chance. Looking at each label, stroke is classified without any misclassification, and almost all other labels can be classified with high accuracy, except grab, hold by hands, hold by hands gently, and the tap/slap pair. The misclassification of grab, hold by hands gently, hold by hands, and tap and slap can be attributed to each subject having different criteria for weak and strong forces. In addition, grab differs from the two hold gestures in that it is performed with one hand rather than with both hands.
While our dataset is not very large and adding more data may improve the accuracy, the future development of our overall control system does not require recognizing the type of touch with high certainty at all times. The obtained accuracy is therefore sufficient.

4.4. Changing Impressions of Robots through Conveying Emotions by Touch

We conducted a t-test to examine whether there was a significant difference in the results of the SD questionnaire before and after the experiment. The null hypothesis is that the experiment does not change the feeling of participants.
The results are shown in Table 8. A positive value of the average shows that participants’ feelings are towards the left-column adjectives in Table 5, which express positive feelings. A negative value of the average shows that the participants’ feelings are towards the right-column adjectives in Table 5, which express negative feelings. The results in Table 8 show that the score differences for all adjective pairs except for “interesting-boring” are positive. After participants touched the robot arm, their feelings toward the robot were more positive.
The p-values were examined at the 0.05 significance level. As Table 8 shows, the null hypothesis was rejected for six adjective pairs: "friendly–unfriendly", "safe–dangerous", "warm–cold", "casual–formal", "approachable–unapproachable", and "amusing–obnoxious". The results also show that the participants' impression changed to a more positive one, with a very low p < 0.005 for three of the adjective pairs: "safe–dangerous", "approachable–unapproachable", and "amusing–obnoxious". In particular, the p-value of the adjective pair "casual–formal" was 0.00, suggesting that the participants' feeling of stiffness toward the robot was greatly improved, from "formal" to "casual". For the other two adjective pairs, the participants' perception of the robot arm changed to that of an approachable and pleasant presence after the experiment.
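For reference, the paired t-test used here can be outlined as follows; the per-participant scores are placeholders, not the study data.

```python
# Minimal sketch of the statistical test described above: a paired t-test
# comparing one adjective pair's SD scores before and after the experiment.
# The example arrays are placeholders, not the study data.
import numpy as np
from scipy import stats

# Hypothetical per-participant scores on one adjective pair (range -1 to +1).
before = np.array([0.0, 0.33, -0.33, 0.0, 0.33, 0.0, -0.33, 0.0])
after = np.array([0.33, 0.67, 0.0, 0.33, 0.33, 0.67, 0.0, 0.33])

t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# Reject the null hypothesis (no change in impression) when p < 0.05.
```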
Other research using PARO, a therapy robot, suggested that communication with PARO using touch improved the participants' mood and reduced their pain [19]. While pain was not investigated in our study, our results show that touching the robot may play a significant role in improving the impression of the robot, even for structures that are neither anthropomorphic nor zoomorphic.

5. Conclusions

In this study, we conducted two experiments. The first was to determine where to locate touch sensors on an inorganic, non-zoomorphic robot arm. The second was to acquire affective touch gestures from participants touching the robot equipped with the sensors. We then used machine learning to classify the touch gestures and succeeded in classifying them with a higher probability than chance. It was also confirmed that the impression of the robot was improved by actually touching it.
Law, Malle and Scheutz investigated the impact of touch on human–robot trust by creating videos of robots touching humans and showing them to participants [6]. In their study, participants did not evaluate the touch as trustworthy when they were aware of a defined function of the robot. Their experimental method and results contrast with our study: we will have to investigate not only touch from human to robot but also from robot to human.
Further work will consist of consolidating our findings. We will conduct a large-scale experiment using long-term interaction with the robot, and we will add some movements to the robot in response to touch in order to complete the scheme proposed in Figure 1.

Author Contributions

Conceptualization, A.H.; methodology, A.H., L.K.R.-A. and G.V.; software, A.H. and L.K.R.-A.; validation, L.K.R.-A. and G.V.; formal analysis, A.H.; resources, G.V.; writing—original draft preparation, A.H.; writing—review and editing, G.V.; supervision, G.V.; project administration, G.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to the internal regulations of Tokyo University of Agriculture and Technology, which waive the need for formal approval for such experiments when they are conducted in accordance with the ethics guidelines.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not available.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Demir, K.A.; Döven, G.; Sezen, B. Industry 5.0 and Human-Robot Co-working. Procedia Comput. Sci. 2019, 158, 688–695. [Google Scholar] [CrossRef]
  2. Rodriguez-Guerra, D.; Sorrosal, G.; Cabanes, I.; Calleja, C. Human-Robot Interaction Review: Challenges and Solutions for Modern Industrial Environments. IEEE Access 2021, 9, 108557–108578. [Google Scholar] [CrossRef]
  3. Market Research Future. Collaborative Robot Market Research Report by Global Forecast to 2023. Available online: https://www.marketresearchfuture.com/reports/collaborative (accessed on 21 January 2021).
  4. Admoni, H.; Scassellati, B. Social Eye Gaze in Human-Robot Interaction: A Review. J. Human-Robot Interact. 2017, 6, 25–63. [Google Scholar] [CrossRef] [Green Version]
  5. Yohanan, S.; MacLean, K.E. The Role of Affective Touch in Human-Robot Interaction: Human Intent and Expectations in Touching the Haptic Creature. Int. J. Soc. Robot. 2012, 4, 163–180. [Google Scholar] [CrossRef]
  6. Law, T.; Malle, B.F.; Scheutz, M. A Touching Connection: How Observing Robotic Touch Can Affect Human Trust in a Robot. Int. J. Soc. Robot. 2021, 13, 2003–2019. [Google Scholar] [CrossRef]
  7. Cabibihan, J.-J.; Ahmed, I.; Ge, S.S. Force and motion analyses of the human patting gesture for robotic social touching. In Proceedings of the 2011 IEEE 5th International Conference on Cybernetics and Intelligent Systems (CIS), Qingdao, China, 17–19 September 2011; pp. 165–169. [Google Scholar]
  8. Eid, M.A.; Al Osman, H. Affective Haptics: Current Research and Future Directions. IEEE Access 2015, 4, 26–40. [Google Scholar] [CrossRef]
  9. Hertenstein, M.J.; Holmes, R.; McCullough, M.; Keltner, D. Touch communicates distinct emotions. Emotion 2006, 6, 528–533. [Google Scholar]
  10. Hertenstein, M.J.; Holmes, R.; McCullough, M.; Keltner, D. The communication of emotion via touch. Emotion 2009, 9, 566–573. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Ju, Y.; Zheng, D.; Hynds, D.; Chernyshov, G.; Kunze, K.; Minamizawa, K. Haptic Empathy: Conveying Emotional Meaning through Vibrotactile Feedback. In Proceedings of the Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–7. [Google Scholar] [CrossRef]
  12. Andreasson, R.; Alenljung, B.; Billing, E.; Lowe, R. Affective Touch in Human–Robot Interaction: Conveying Emotion to the Nao Robot. Int. J. Soc. Robot. 2018, 10, 473–491. [Google Scholar] [CrossRef] [Green Version]
  13. Jung, M.M.; Poel, M.; Poppe, R.; Heylen, D.K.J. Automatic recognition of touch gestures in the corpus of social touch. J. Multimodal User Interfaces 2017, 11, 81–96. [Google Scholar] [CrossRef] [Green Version]
  14. Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in Temperament. Curr. Psychol. 1996, 14, 261–292. [Google Scholar] [CrossRef]
  15. Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59. [Google Scholar] [CrossRef]
  16. Touchence. Available online: http://touchence.jp/products/chip.html (accessed on 14 November 2021).
  17. Xela. Robotics. Available online: https://xelarobotics.com/ja/xr1944 (accessed on 14 November 2021).
  18. Kanda, T.; Ishiguro, H.; Ono, T.; Imai, M.; Nakatsu, R. An evaluation on interaction between humans and an autonomous robot Robovie. J. Robot. Soc. Jpn. 2002, 20, 315–323. [Google Scholar] [CrossRef] [Green Version]
  19. Geva, N.; Uzefovsky, F.; Levy-Tzedek, S. Touching the social robot PARO reduces pain perception and salivary oxytocin levels. Sci. Rep. 2020, 10, 9814. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The proposed overall touch communication system with the two-step expressive gesture estimation. In this paper, we present the touch sensing and touch gesture classification blocks.
Figure 2. SAM (self-assessment manikin) scale. Adapted with permission from Ref. [15]. Copyright 1994 Peter Lang.
Figure 3. (a) Robot arm used for the experiment (body number is used in experiments), (b) robot arm part names.
Figure 4. Touch gesture examples on the sensor by one participant. (a) Tickle, (b) hold double.
Figure 5. Snapshots of the experiment.
Figure 6. Robot arm equipped with the sensors, used to collect touch behaviors.
Figure 7. The CNN model used for ShokacChipTM data.
Figure 8. Image of sensor data.
Figure 9. The model used for Xela sensor data.
Figure 10. Confusion matrix for the recognition of fifteen touch gestures: ShokacChipTM, average rate: 49%.
Figure 11. Confusion matrix for the recognition of sixteen touch gestures: Xela Robotics tactile sensor, average rate: 75%.
Table 1. Touch gestures dictionary [4].
Touch | Definition
Grab | Grasp or seize the robot arm suddenly and roughly.
Hold by hands | Put the robot arm between both your flat hands firmly.
Hold by hands gently | Put the robot arm between both your flat hands gently.
Massage | Rub or knead the robot arm with your hands.
Nuzzle | Gently rub or push against the robot arm with your nose.
Pat | Gently and quickly touch the robot arm with the flat of your hand.
Pinch | Tightly and sharply grip the robot arm between your fingers and thumb.
Poke | Jab or prod the robot arm with your finger.
Press | Exert a steady force on the robot arm with your flattened fingers or hand.
Rub | Move your hand repeatedly back and forth on the surface of the robot arm with firm pressure.
Tap | Strike the robot arm with a quick light blow or blows using one or more fingers.
Tickle | Touch the robot arm with light finger movements.
Tremble | Shake against the robot arm with a slight rapid motion.
Scratch | Rub the robot arm with your fingernails.
Shake | Press the robot arm intermittently with your fingers (Shake_1), or hold the robot arm with your hand and move it up and down (Shake_2).
Slap | Quickly and sharply strike the robot arm with your open hand.
Squeeze | Firmly press the robot arm between your fingers or both hands.
Stroke | Move your hand with gentle pressure over the robot arm, often repeatedly.
Table 2. Contact zones of the robot, as described in Figure 3b, that participants touched to express their emotions (unit: %).
Body Part | Total | Love | Sympathy | Gratitude | Happiness | Sadness | Disgust | Anger | Fear
Hand | 438.1 | 71.4 | 42.9 | 100.0 | 14.3 | 66.7 | 57.1 | 71.4 | 14.3
Wrist | 176.3 | 42.9 | 14.3 | 28.6 | 28.6 | 33.3 | 0.0 | 14.3 | 14.3
Forearm | 162.0 | 42.9 | 28.6 | 28.6 | 28.6 | 33.3 | 0.0 | 0.0 | 0.0
Elbow | 14.3 | 14.3 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0
Upper arm | 28.6 | 14.3 | 14.3 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0
Shoulder | 71.5 | 0.0 | 14.3 | 0.0 | 14.3 | 0.0 | 28.6 | 0.0 | 14.3
Total | 890.8 | 185.8 | 114.4 | 157.2 | 85.8 | 133.3 | 85.7 | 85.7 | 42.9
Table 3. Percentage: touch gestures participants use in each emotion (unit: %).
Gesture | Love | Sympathy | Gratitude | Sadness | Happiness | Disgust | Anger | Fear
Hold | 0 | 0 | 14.3 | 14.3 | 0 | 0 | 0 | 0
Massage | 0 | 14.3 | 0 | 14.3 | 14.3 | 0 | 0 | 0
Nuzzle | 0 | 0 | 14.3 | 0 | 0 | 0 | 0 | 0
Pat | 28.6 | 0 | 14.3 | 28.6 | 0 | 0 | 0 | 0
Pinch | 0 | 0 | 0 | 0 | 0 | 28.6 | 14.3 | 14.3
Poke | 0 | 0 | 0 | 0 | 0 | 28.6 | 14.3 | 28.6
Press | 0 | 0 | 0 | 0 | 0 | 14.3 | 42.9 | 0
Rub | 0 | 0 | 0 | 0 | 28.6 | 0 | 0 | 0
Scratch | 14.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Shake | 14.3 | 0 | 57.1 | 14.3 | 28.6 | 0 | 0 | 14.3
Slap | 0 | 0 | 0 | 0 | 0 | 42.9 | 28.6 | 0
Squeeze | 0 | 0 | 0 | 0 | 0 | 0 | 14.3 | 0
Stroke | 57.1 | 28.6 | 14.3 | 42.9 | 28.6 | 0 | 0 | 0
Tap | 14.3 | 28.6 | 14.3 | 0 | 28.6 | 0 | 0 | 0
Tickle | 28.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0
No touch | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 28.6
Table 4. PAD of emotion that participants evaluate after touch.
Emotion | Dimension | Median | Average | Standard Deviation | Variance
Love | P | 0.75 | 0.71 | 0.09 | 0.01
Love | A | 0.25 | 0.14 | 0.52 | 0.27
Love | D | 0.00 | 0.07 | 0.24 | 0.06
Sympathy | P | 0.00 | −0.07 | 0.35 | 0.12
Sympathy | A | −0.25 | −0.25 | 0.41 | 0.17
Sympathy | D | 0.25 | 0.04 | 0.47 | 0.22
Gratitude | P | 0.75 | 0.71 | 0.17 | 0.03
Gratitude | A | 0.25 | 0.25 | 0.32 | 0.10
Gratitude | D | 0.25 | 0.07 | 0.43 | 0.18
Sadness | P | −0.75 | −0.68 | 0.24 | 0.06
Sadness | A | −0.50 | −0.50 | 0.29 | 0.08
Sadness | D | −0.50 | −0.50 | 0.35 | 0.13
Happiness | P | 0.75 | 0.79 | 0.17 | 0.03
Happiness | A | 0.50 | 0.43 | 0.37 | 0.14
Happiness | D | 0.25 | 0.25 | 0.32 | 0.10
Disgust | P | −0.75 | −0.71 | 0.30 | 0.09
Disgust | A | 0.25 | 0.25 | 0.29 | 0.08
Disgust | D | 0.50 | 0.43 | 0.35 | 0.12
Anger | P | −0.75 | −0.68 | 0.12 | 0.01
Anger | A | 0.75 | 0.68 | 0.19 | 0.04
Anger | D | 0.50 | 0.61 | 0.24 | 0.06
Fear | P | −0.50 | −0.61 | 0.13 | 0.02
Fear | A | 0.25 | 0.14 | 0.40 | 0.16
Fear | D | −0.75 | −0.64 | 0.13 | 0.02
Table 5. Adjective pairs used in the SD questionnaire [18].
Adjectives Pairs Used in SD Questionnaire
Gentle-Scary
Pleasant-Unpleasant
Friendly-Unfriendly
Safe-Dangerous
Warm-Cold
Cute-Hateful
Casual-Formal
Easy to understand-Difficult to understand
Approachable-Unapproachable
Cheerful-Gloomy
Considerate-Selfish
Funny-Unfunny
Amusing-Obnoxious
Likeable-Dislikeable
Interesting-Boring
Good-Bad
Table 6. Hyperparameters of the 1D CNN for ShokacChipTM data.
Layer Type | Activation | Kernel Size | Padding | Output Shape | Param
Convolution 1D | ReLU | 5 | same | (None, 180, 32) | 992
Max pooling 1D | - | - | - | (None, 90, 32) | 0
Convolution 1D | ReLU | 3 | same | (None, 90, 32) | 3104
Max pooling 1D | - | - | - | (None, 45, 32) | 0
Convolution 1D | ReLU | 3 | same | (None, 22, 32) | 3104
Max pooling 1D | - | - | - | (None, 45, 32) | 0
Global average 1D | - | - | - | (None, 32) | 0
Dense | ReLU | - | - | (None, 32) | 1056
Dense | Softmax | - | - | (None, 16) | 495
Table 7. Hyperparameters of the 3D CNN.
Layer Type | Activation | Kernel Size | Padding | Output Shape | Param
Convolution 3D | ReLU | (5, 2, 2) | same | (None, 180, 8, 4, 16) | 976
Convolution 3D | ReLU | (5, 2, 2) | same | (None, 180, 8, 4, 16) | 5136
Max pooling 3D | - | - | - | (None, 90, 8, 4, 16) | 0
Convolution 3D | ReLU | (3, 2, 2) | same | (None, 90, 8, 4, 32) | 6176
Convolution 3D | ReLU | (3, 2, 2) | same | (None, 90, 8, 4, 32) | 12,320
Max pooling 3D | - | - | - | (None, 45, 8, 4, 32) | 0
Convolution 3D | ReLU | (3, 2, 2) | same | (None, 45, 8, 4, 32) | 12,320
Global average 3D | - | - | - | (None, 32) | 0
Dense | ReLU | - | - | (None, 32) | 1056
Dense | ReLU | - | - | (None, 32) | 1056
Dense | Softmax | - | - | (None, 16) | 528
Table 8. Result of SD questionnaire (**: p-value < 0.005, *: p-value < 0.05).
Adjective Pair (+1 to −1) | Average Before Experiment | Average After Experiment | Difference | p-Value
Gentle–Scary | 0.0384 | 0.154 | 0.12 | 0.196
Pleasant–Unpleasant | 0.212 | 0.327 | 0.12 | 0.236
Friendly–Unfriendly * | 0.0961 | 0.404 | 0.31 | 0.01
Safe–Dangerous ** | 0.0385 | 0.442 | 0.40 | 0.004
Warm–Cold * | −0.0385 | 0.269 | 0.31 | 0.03
Cute–Hateful | 0.327 | 0.442 | 0.12 | 0.221
Casual–Formal ** | 0.135 | 0.365 | 0.5 | 0.00
Easy–Difficult (to understand) | 0.0192 | 0.288 | 0.27 | 0.055
Approachable–Unapproachable ** | 0.0192 | 0.442 | 0.42 | 0.003
Cheerful–Gloomy | 0.154 | 0.212 | 0.06 | 0.583
Considerate–Selfish | 0.115 | 0.135 | 0.02 | 0.853
Funny–Unfunny | 0.365 | 0.5 | 0.13 | 0.157
Amusing–Obnoxious * | 0.25 | 0.481 | 0.23 | 0.006
Likeable–Dislikeable | 0.442 | 0.615 | 0.17 | 0.112
Interesting–Boring | 0.538 | 0.519 | −0.02 | 0.868
Good–Bad | 0.423 | 0.481 | 0.06 | 0.575
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
