Article

Facial Feature Movements Caused by Various Emotions: Differences According to Sex

Kun Ha Suh, Yoonkyoung Kim and Eui Chul Lee
1 Department of Computer Science, Graduate School, Sangmyung University, 7 Hongji-Dong, Jongro-Ku, Seoul 110-743, Korea
2 Department of Computer Science, Sangmyung University, 7 Hongji-Dong, Jongro-Ku, Seoul 110-743, Korea
* Author to whom correspondence should be addressed.
Symmetry 2016, 8(9), 86; https://doi.org/10.3390/sym8090086
Submission received: 28 March 2016 / Revised: 22 August 2016 / Accepted: 22 August 2016 / Published: 30 August 2016

Abstract

Facial muscle micro-movements for eight emotions were induced via visual and auditory stimuli and were verified according to sex. Thirty-one main facial features were chosen from the 121 facial features initially obtained from the Kinect API, and the average change in pixel value was measured after image alignment. The proposed method is advantageous in that facial micro-expressions can be analyzed in real time using the 31 facial feature points, which allows for comparisons. The amount of micro-expression under the various emotion stimuli was comparatively analyzed for differences according to sex. Men's facial movements were similar for each emotion, whereas women's facial movements differed for each emotion. Six feature positions were significantly different according to sex; in particular, the inner eyebrow of the right eye showed a difference at a significance level of p < 0.01. Consequently, in terms of discriminative power, men's ability to separate one emotion from the others through facial expression was lower than women's, despite men's average movements being higher than women's. Additionally, the asymmetric phenomena around the left eye region of women appeared more strongly in cases of positive emotions.

1. Introduction

Emotion estimation methods have been studied recently to help develop affective smart content. To estimate emotions, bio-signals or bio-images extracted from several body parts can be used to measure intrinsic and extrinsic parameters of human emotion.
In previous research, physiological signal-based methods, such as electrocardiography (ECG), photoplethysmography (PPG), electroencephalography (EEG), and galvanic skin response (GSR), have been used for the quantitative estimation of emotion [1,2,3,4]. However, because these methods involve attaching sensors to the body, the inconvenience can inadvertently cause a negative emotion that creates noise in the readings. Similarly, eye movement recognition methods also require equipment to be attached, such as a head-mounted device (HMD) [5]. In our approach, to avoid mis-estimation due to attached sensors, an image-based method is used for emotion estimation.
Image-based emotion recognition methods, including those based on gestures, eye movements, and facial expressions, have also been studied [6,7,8]. Many gesture recognition approaches exist, but quantitative measurement and the summarization of multiple features remain difficult. Eye-image-based methods, such as those using blink rate or pupil size, require equipment such as a high-resolution camera, an infrared illuminator, and a zoom lens. Therefore, facial expression recognition is more convenient and simpler as a camera-based method.
From a different viewpoint, facial expressions have been studied since the 1960s, when Ekman and Friesen defined facial muscle movements and their relationships [9]. From this work, the Facial Action Coding System (FACS) was created, which has been designated the "first face emotion map". However, FACS has only been described, not experimentally verified. Therefore, camera-based analysis is appropriate to assist its verification.
In previous work, facial expression recognition has been studied using the Active Appearance Model (AAM) algorithm [10], whose features are extracted based on Principal Component Analysis (PCA) [11]. Because this method has a long processing time, it is difficult to run in real time. Additionally, when using eigen-points, the method has some constraints; for instance, face movements should be exaggerated and illumination conditions must be kept constant [12]. To estimate social emotions, previous research has measured the amount of movement of the upper body [13]. This is calculated by subtracting two successive images, and the method does not consider non-linear facial movements. Another previous study approached facial expression recognition by using facial movement features [14]. That study analyzed facial muscle movement in static images and therefore could not capture the dynamics of the facial elements and muscle movements.
Eckhard Hess, a psychologist from the University of Chicago, defined the characteristics of pupil accommodation for each sex [15]; he observed that pupil size could change even under the same light intensity and proposed that pupil size changes when looking at attractive or hostile objects. He found the following: when men looked at images such as buildings and landscapes, pupil size did not change, but the pupil changed rapidly when men looked at a woman's naked body; for women, images of a naked male body and of a smiling baby generated pupil changes. In these studies, differences according to sex can be measured. Many researchers have studied differences in emotional expression according to sex. Partala and Surakka found differences between men and women by estimating pupil accommodation [7]; when specific emotions were elicited, both sexes' pupil sizes were larger compared to the neutral state. Go et al. found that, when emotions changed, women's expressions were more clearly classified than men's in speech signals [16]. Another study analyzed three dimensions: covert responding, interpersonal expression, and attitude [17]. In that research, interpersonal expression (the difference of representative power among individuals) was not considered, and the attitude analysis could be discussed as a subjective valuation.
Several studies have used facial expressions to measure features for each sex; for example, one study determined differences according to sex in whether facial expressions were correctly chosen [16]. Women had a higher rate of correct classification, distinguishing one emotion from another. However, that research analyzed only two dimensions (valence and arousal), which is insufficient to characterize facial expressions comprehensively. Other research used visual stimuli alone to elicit emotion and derived the same result [18]. Previous studies verified that using only one stimulus is insufficient to elicit a specific emotion [19]; therefore, a single modality is inadequate for emotion recognition. To distinguish sex characteristics, previous research used both facial images and speech signals [20]: facial feature vectors were obtained using linear discriminant analysis (LDA) [21], speech features were obtained using a wavelet sub-band, and the two were merged. This approach performed better than previous studies; however, LDA, the wavelet transform, and the merging step require a long processing time.
To address the weaknesses of previous studies, we designated 31 points that represent significant facial movement based on camera images. To overcome the limitation that facial expressions have been recognized only in terms of extreme expressions, movement at these 31 points was analyzed at 49 Hz using a Kinect camera.
We designed the experimental analysis in three phases. First, we analyzed the amount of facial movement and compared this variation between men and women. Second, we used a t-test to verify statistically significant differences according to sex and confirmed the points with significant differences between men and women for each emotion. Third, we analyzed the discriminative power of distinguishing one emotion from another for each sex. We used eight emotions in this research: Amusement, Excitement, Contentment, Awe, Fear, Anger, Disgust, and Sad. To elicit specific emotions, both visual and auditory stimuli were given to participants.

2. Related Work on the Asymmetry of Facial Expressions

Many studies exist regarding facial asymmetry in emotional facial expressions. In these studies, the subject's handedness and sex have been mentioned as factors that affect facial asymmetry. First, from the viewpoint of handedness, although it has been shown that handedness acts on laterality in relation to linguistic functions [22], no relationship has yet been found between handedness and the laterality of emotional facial expressions, even though it has been studied by many researchers. On the other hand, many studies of right-hemisphere processes argue that, since nonlinguistic functions are based in the right hemisphere, they may not be related to handedness. From the viewpoint of sex, although the role of sex as a variable acting on laterality data is controversial [23,24], there are grounds for the argument that the lateralization patterns of males and females differ. In relation to linguistic processing, several studies have argued that females have weaker laterality than males [25,26]. On the contrary, studies of emotional processing have argued that females may have stronger laterality than males [27,28,29].
Existing studies regarding expressions according to emotions can be largely divided into those that analyzed volitional expressions and those that analyzed spontaneous expressions.

2.1. Volitional Facial Expression

First, there are studies that analyzed volitional facial expressions; these studies conducted experiments by requesting subjects to make certain facial expressions. Borod and Caron reported that negative facial expressions are made more frequently with the left face, while positive facial expressions are relatively less lateralized [30]. In addition, Borod et al. reported that, whereas male facial expressions were leftward lateralized, in females, negative facial expressions were leftward lateralized while positive facial expressions were rightward lateralized [31]. Campbell indicated that people tended to make facial expressions with their left faces more frequently than their right faces, and later reported that asymmetry appeared more prominently on the left faces of left-handers than of right-handers [32,33]. Heller and Levy argued that happiness appeared on the left faces of both right-handers and left-handers [34]. Sackeim and Gur argued that asymmetry was more severe when intense emotions such as disgust and anger were expressed, and that changes in the left face were larger when negative emotions were expressed than when positive emotions were expressed [35]. In addition, Cacioppo and Petty, and Knox, argued that no difference was found between the two sides of the face [36,37].
To review the positive emotions in the abovementioned studies in detail, some studies argued that when right-handed adults were asked to express volitional smiles or happiness, facial expressions might appear asymmetrically on the left or right face. Other studies reported that leftward asymmetry appeared in facial expressions of happiness in both right-handers and left-handers. Several other positive emotions showed similarly ambiguous results: no significant difference in the expressions of flirtation, clowning, sexual arousal, or astonishment/surprise was found between the two sides of the face, although some cases of left asymmetry were found for males' sexual arousal. To review the negative emotions in detail, whereas many studies argued that facial expressions of negative emotions were made with the left face, other studies failed to find any difference. Many cases of leftward asymmetry were reported in the facial expressions of disgust, disapproval, and confusion. Although Sackeim and Gur reported that leftward asymmetry appeared in facial expressions of anger, an actual recalculation indicated that the difference was not large [38]. The results of analyses of other negative facial expressions are less clear. Whereas Borod and Caron and Borod et al. reported leftward asymmetry in facial expressions of sadness/grief, Cacioppo and Petty, Sackeim and Gur, and Strauss and Kaplan reported that no significant asymmetry appeared. No significant difference was found in a study that investigated facial expressions of fear. In summary, most studies found that the left and right faces were involved in facial expressions of positive emotions at the same frequency, and studies generally argued that the left face (right hemisphere) was more frequently involved than the right face in facial expressions of negative emotions.

2.2. Spontaneous Facial Expression

There are also studies of spontaneous facial expressions, which examined the expressions that appeared when images were shown to subjects. Borod et al. argued that the left face moved frequently in facial expressions of negative emotions and that, for facial expressions of positive emotions, the left face was active in males whereas the expressions were not lateralized in females [31]. Ekman et al. indicated that leftward asymmetry appeared more frequently in volitional smiles than in spontaneous smiles and that the number of emotional facial expressions with leftward asymmetry was equal to the number with rightward asymmetry [39]. They also indicated that the results for negative emotions were similar but could not be generalized because the amount of data was small. As with most other studies, Lynn and Lynn argued that there was no difference in asymmetry between the two sides of the face [40,41].
To review the above studies of positive emotions in detail, for the facial expressions of positive emotions of normal subjects, one study reported leftward asymmetry in male subjects, whereas most other studies found the same number of cases of rightward asymmetry as of leftward asymmetry. To review the studies of negative emotions in detail, although quite a few cases of leftward asymmetry were found by Borod et al., the study by Ekman et al. reported equal numbers of cases of asymmetry on both sides of the face. However, these authors noted that their results could not be considered reliable because the amount of data on negative facial expressions was too small. In summary, for positive emotions, most studies indicated that the left and right faces were involved at the same frequency, but the results cannot be easily generalized because the number of studies on spontaneous facial expressions is very small and attempts to analyze spontaneous facial expressions according to emotional valence are quite insufficient.
In the existing studies mentioned above, facial asymmetry was evaluated firsthand by observers, and a limitation exists in that few data are available to evaluate the results of these studies quantitatively and objectively.

3. Materials and Methods

3.1. Face Feature Extraction and Measuring Facial Feature Changes

The Kinect face tracking SDK uses the Kinect coordinate system to output its 3D tracking results [42]. The origin is located at the camera's optical center (sensor), the Z-axis points toward the user, and the Y-axis points up, as shown in Figure 1. The face tracking SDK tracks faces in 3D based on the AAM and depth data, and provides 121 feature points on a face. Using the face tracking SDK, we detect the face and the coordinates of the 121 facial feature points in two successive frames [(a) in Figure 2]. Among them, we selected 31 main significant points based on Ekman's action units, as shown in Figure 3b [(b) in Figure 2]. In selecting the 31 main points, we excluded facial contour points that are irrelevant to the facial expression muscles.
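As a concrete illustration of this selection step, the following minimal Python sketch indexes the 31 expression-relevant points out of the 121 points returned per frame. The index list KEY_POINT_INDICES and the function name are ours; the actual Kinect indices of the 31 points are not listed here, so the placeholder below is not the real mapping.

```python
import numpy as np

# Placeholder for the Kinect indices of the 31 expression-relevant points
# (chosen based on Ekman's action units, excluding facial contour points).
# The real index list is not given here, so this is only illustrative.
KEY_POINT_INDICES = list(range(31))


def select_key_points(all_points: np.ndarray) -> np.ndarray:
    """Keep the 31 main feature points from one tracked frame.

    all_points: (121, 2) array of (x, y) image coordinates produced by the
    Kinect face tracking SDK for a single frame.
    """
    assert all_points.shape == (121, 2)
    return all_points[KEY_POINT_INDICES]
```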
To analyze facial muscle movement between successive frames, we measure the variation of pixel values at each feature point. However, the natural subtle vibrations of the human body could make measuring pixel values at exactly the same position inaccurate. Thus, we performed an alignment step, called the shift-matching scheme, before the pixel value measurement [(c) in Figure 2] [43]. First, the two surrounding regions (10 × 10 pixels) of each feature point were cropped from the two successive frames. Then, we applied a local binary pattern (LBP) operator to encode facial texture information [44]. The LBP is an efficient texture operator that labels a pixel by thresholding its 3 × 3 neighborhood against the center pixel value and expressing the result as a binary number, as shown in Figure 4.
An ordered set of binary values can be expressed in decimal form as shown in Equation (1) [45]:
LBP(x_c, y_c) = \sum_{n=0}^{7} s(i_n - i_c) \, 2^n    (1)
where x_c and y_c denote the vertical and horizontal coordinates of the center position, respectively. Also, i_c and i_n denote the gray value of the center pixel and those of the eight neighboring pixels, respectively. The function s(x) is defined as:
s(x) = \begin{cases} 1 & \text{if } x \geq 0 \\ 0 & \text{if } x < 0 \end{cases}    (2)
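The LBP computation of Equations (1) and (2) can be sketched as follows. This is an illustrative implementation, not the authors' code, and the clockwise ordering of the eight neighbors is an assumption (any fixed ordering yields a valid LBP labeling).

```python
import numpy as np

def lbp_code(patch: np.ndarray, y: int, x: int) -> int:
    """8-bit LBP code of Equation (1) for the pixel at (y, x) of a grayscale patch."""
    center = int(patch[y, x])
    # Eight neighbors of the 3x3 neighborhood, enumerated clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for n, (dy, dx) in enumerate(offsets):
        s = 1 if int(patch[y + dy, x + dx]) - center >= 0 else 0  # s(x) of Equation (2)
        code |= s << n
    return code
```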
To calculate the Hamming distance, LBP codes are extracted from the two surrounding regions. Because these surrounding regions can be misaligned horizontally and vertically, translations of −2 to +2 pixels are considered for alignment. The translational factor for alignment is determined at the position of minimum Hamming distance [46]. The Hamming distance (D_h) is the number of positions in which two bit-vectors differ:
D_h = \sum_{i=1}^{n} |x_i - y_i|    (3)
The minimum Hamming distance means that the two surrounding regions are almost the same. Figure 5 shows the conceptual diagram of the facial feature alignment method.
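A minimal sketch of this shift-matching step is given below, assuming grayscale surrounding regions and a bit-wise Hamming distance between the LBP codes of the overlapping areas; the function names are ours, not part of any SDK, and the interpretation of the distance as a bit count follows Equation (3).

```python
import numpy as np

def lbp_map(region: np.ndarray) -> np.ndarray:
    """LBP code (Equation (1)) of every interior pixel of a grayscale region."""
    center = region[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(center.shape, dtype=np.int32)
    h, w = region.shape
    for n, (dy, dx) in enumerate(offsets):
        neighbor = region[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << n
    return codes.astype(np.uint8)


def best_shift(region_a: np.ndarray, region_b: np.ndarray, max_shift: int = 2):
    """Translation (dy, dx) of region_b, within +/- max_shift pixels, that
    minimizes the Hamming distance (Equation (3)) between the LBP codes of
    the overlapping parts of the two surrounding regions."""
    codes_a, codes_b = lbp_map(region_a), lbp_map(region_b)
    h, w = codes_a.shape
    best, best_dist = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = codes_a[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            b = codes_b[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            # Hamming distance: number of differing bits over the overlap.
            dist = int(np.unpackbits(np.bitwise_xor(a, b)).sum())
            if dist < best_dist:
                best_dist, best = dist, (dy, dx)
    return best
```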
Then, the facial feature change (Ci) is calculated as follows [(d) in Figure 2]:
C_i = \frac{1}{mn} \sum_{a=0}^{n-1} \sum_{b=0}^{m-1} \left| I_i(a, b) - I_{i+1}(a, b) \right|    (4)
In Equation (4), m and n indicate the width and height, respectively, of the region where the two surrounding regions overlap after alignment (Figure 5).
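Under the same assumptions, the per-point change of Equation (4) reduces to the mean absolute pixel difference over the overlapped area once the shift found above has been applied; the sketch below takes the (dy, dx) shift as an argument, and the function name is ours.

```python
import numpy as np

def feature_change(region_i: np.ndarray, region_next: np.ndarray,
                   dy: int, dx: int) -> float:
    """Facial feature change C_i of Equation (4): the average absolute pixel
    difference over the m x n area where the two aligned surrounding regions
    (same feature point, two successive frames) overlap."""
    h, w = region_i.shape
    a = region_i[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
    b = region_next[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
    return float(np.mean(np.abs(a.astype(np.float64) - b.astype(np.float64))))
```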

3.2. Experiment Design

We tested 30 participants (15 men and 15 women) aged between 24 and 35 years (men: mean = 27.8, standard deviation = 2.9; women: mean = 27.0, standard deviation = 2.5). We paid each participant $20 for participating and advised them to prepare for the experiments by resting sufficiently. To elicit emotion, we used two kinds of stimuli. In previous research, the power of elicited emotion was estimated using EEG signals for three modalities: auditory, visual, and combined visual and auditory [19]. Using both visual and auditory stimuli is the most efficient way to elicit emotion; therefore, we used a combined visual and auditory modality. We used visual stimuli of artistic photography taken from a photo-sharing site (http://www.deviantart.com). The artistic photographs were obtained using the emotion categories designated by the artists who uploaded them. Additionally, the artistic photographs were verified in terms of their emotion classification using low-level features [47]. These photos show an effect similar to that of the International Affective Picture System (IAPS), which is most commonly used for visual stimuli [48]. Auditory stimuli were downloaded from a music-sharing site (http://musicovery.com), where the music is arranged according to Russell's 2D emotion model, with an arousal-relaxation axis and a positive-negative axis. We downloaded music corresponding to the location of each target emotion and then verified the validity of the auditory stimuli via self-assessment; their validity was verified with 98% reliability. We used a total of eight emotions; Figure 6 shows their mapping onto Russell's 2D emotion model.
We displayed 50 visual stimuli as a slideshow for each emotion. Each participant used an earpiece to concentrate on the auditory stimulus. The Kinect camera was located at the center position under the monitor in order to capture the front of the face. To remove order effects, the sequence of emotion elicitation was randomized [49]. Participants had 2 min of rest between emotions. The experimental procedure is shown in Figure 7.
Because facial movements differ between individuals, we first measured a neutral (baseline) state; this allowed us to minimize individual variation by subtracting the neutral-state measurement from the measurement for each elicited emotion.
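A minimal sketch of this baseline correction is shown below, assuming per-frame change values C_i for the 31 points are available for both the emotional and the neutral session; the array shapes and the function name are our assumptions.

```python
import numpy as np

def baseline_corrected_movement(emotion_frames: np.ndarray,
                                neutral_frames: np.ndarray) -> np.ndarray:
    """Per-point facial movement with individual variation removed.

    emotion_frames, neutral_frames: (frames, 31) arrays holding the change
    value C_i of each of the 31 feature points for every captured frame of
    the emotional session and of the neutral session, respectively.
    Returns a length-31 vector: the average movement under the emotional
    stimulus minus the same participant's neutral-state average.
    """
    return emotion_frames.mean(axis=0) - neutral_frames.mean(axis=0)
```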

4. Results

4.1. Facial Movement Analysis for Men and Women

First, we analyzed the amount of facial movement for each sex across eight emotions.

4.1.1. Men’s Facial Movement Results

Figure 8 shows the analysis results of the men's facial feature movements; the squares indicate significant points with higher movement than the average over all points of both sexes (detailed in Table 1).
The pattern of men's facial feature movements was similar for all emotions. In particular, Amusement, Excitement, Contentment, Awe, and Sad showed exactly the same distribution. Fear and Disgust showed particularly high movement at point 14. Notably, the facial feature movements of men were symmetrical for all emotional stimuli.

4.1.2. Women’s Facial Movement Results

Figure 9 shows the analysis results of the women's facial feature movements; the squares indicate significant points with higher movement than the average over all points of both sexes (detailed in Table 2).
While the facial movement distribution of men was similar for all emotions, the women's distributions varied across emotions. Some points, such as points 2, 11, 12, 13, 22, 24, and 26, showed commonly high movement. In particular, unlike the men's results, the facial feature movements of women were asymmetric for most emotional stimuli: the facial features around the left eye showed greater movement than those on the right eye side for all emotional stimuli, as shown in Figure 9.
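The significant points marked by the squares in Figures 8 and 9 (and by the yellow-filled cells of Tables 1 and 2) can be reproduced with a simple threshold against the both-sex average, as in this sketch; the function name is ours.

```python
import numpy as np

def points_above_overall_average(per_point_movement: np.ndarray,
                                 overall_average: float) -> np.ndarray:
    """1-based indices (numbering of Figure 3b) of the feature points whose
    average movement exceeds the average over all points of both sexes."""
    return np.where(per_point_movement > overall_average)[0] + 1

# Example: for Amusement, the both-sex average over all points is 0.509
# (last row of Tables 1 and 2), so any point with movement above 0.509
# would be marked as significant for that emotion.
```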

4.1.3. Statistical Differences between Men and Women

We then analyzed the statistically significant differences between the sexes. We used a t-test to measure the statistical difference at the points where significantly high facial movement was found for both sexes in Section 4.1 (Figure 8 and Figure 9). Figure 10 shows the results of the t-test between men and women.
The squares and triangles indicate significant differences at significance levels of p < 0.01 and p < 0.1, respectively (Figure 10). Point 4 shows a significant difference at p < 0.01 for all emotions except Amusement. Fear, Disgust, and Sad had similar distributions of differing facial movements. Using the t-test, we verified the objective differences between men and women.
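The per-point comparison can be sketched with an independent two-sample t-test, as below; we assume that the baseline-corrected movement of each participant is available as a (participants × 31) array per sex and per emotion, and that equal variances are assumed as in a standard t-test.

```python
import numpy as np
from scipy import stats

def sex_difference_pvalues(men: np.ndarray, women: np.ndarray) -> np.ndarray:
    """Independent two-sample t-test per feature point for one emotion.

    men:   (15, 31) baseline-corrected movements of the male participants
    women: (15, 31) baseline-corrected movements of the female participants
    Returns 31 p-values; points with p < 0.01 (or p < 0.1) correspond to the
    squares (or triangles) of Figure 10.
    """
    _, p_values = stats.ttest_ind(men, women, axis=0)
    return p_values
```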

4.2. Ability to Distinguish One Emotion from Another for Men and Women

We compared the ability to distinguish one emotion from another between men and women.
Figure 11 shows the results when individual variation is removed; that is, a positive number means that facial movement was larger than in the neutral state, and a negative number means that it was smaller. Consequently, all eight emotions produced larger facial movements than the neutral state. The average facial movement of men was higher than that of women for all emotions; in this sense, men's facial expressions were more abundant than women's. However, it is insufficient to judge this only by the average, so we also examined the standard deviations across emotions, which were 0.07 for men and 0.13 for women. Women thus showed a greater ability to distinguish between emotions than men, despite women's average facial movement being lower than men's. In a previous study, women showed more abundant emotional expression than men in brain signals [50]. We identified a similar result from the standard deviations of the results for men and women.
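This standard-deviation comparison can be reproduced directly from the per-emotion averages in the last rows of Tables 1 and 2; the sketch below uses the population standard deviation and recovers values close to the reported 0.07 (men) and 0.13 (women).

```python
import numpy as np

# Per-emotion averages over all 31 points (order: Amusement, Excitement,
# Contentment, Awe, Fear, Anger, Disgust, Sad), taken from Tables 1 and 2.
men_avgs = np.array([0.516, 0.588, 0.513, 0.489, 0.667, 0.442, 0.603, 0.598])
women_avgs = np.array([0.502, 0.384, 0.383, 0.302, 0.068, 0.196, 0.179, 0.270])

def expressiveness(per_emotion_average: np.ndarray) -> float:
    """Discriminative power of one sex: the standard deviation of the average
    facial movement across the eight emotions. A larger value means the
    emotions are better separated in the facial expressions."""
    return float(np.std(per_emotion_average))

print(round(expressiveness(men_avgs), 2), round(expressiveness(women_avgs), 2))
# -> approximately 0.07 and 0.13, matching the values reported above.
```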

5. Discussion

In this paper, we performed an experiment to verify 31 facial feature movements of men and women according to eight emotional stimuli. Then, the statistical significances of each facial feature for all emotion stimuli were validated. Additionally, facial expressiveness between men and women was analyzed by comparing the standard deviation of facial feature movement of each emotion.
First, men's facial feature movements appeared across most facial positions, and the movement tendencies were not clearly separable among the different emotion stimuli. Only two positions, the forehead center and the right bottom cheek, showed greater movements for two emotions, fear and disgust.
Second, women showed facial feature movements that were more separable according to emotion than men's. Although several facial features showed equally significant movements for all emotions, the features around the eyes showed different movement tendencies for each emotion. Next, the difference in facial feature movements between men and women for each emotion was confirmed by calculating statistical significance. For all emotions, the inner brow of the right eye consistently showed significant differences between men and women. Additionally, movements around the lips were also regarded as main feature positions for separating men and women for each emotion.
According to the first and second results for men and women, the symmetry issue can be compared. As shown in Figure 8 and Figure 9, the facial feature movements of men and women were symmetrical and asymmetrical, respectively. In particular, the asymmetric phenomena of women appeared more strongly around the left eye region in cases of positive emotions, such as amusement, excitement, contentment, and awe, as shown in Figure 9. This result concurs with the seminal research of Ekman [39].
Third, the standard deviation of the average facial feature movements across emotions was comparatively analyzed for each sex. A greater standard deviation of the average movements across emotions means greater expressiveness for each emotion; that is, different emotions are reflected well in the facial expressions. Although the men's facial feature movement was greater than the women's, the standard deviation for the women was almost twice that for the men. Consequently, women's emotional expressiveness through facial expressions is better than men's.
In previous research, facial expression recognition for different emotions was actively studied based on psychological theories, such as Ekman's facial action units [9]. However, there was no research comparing facial expressiveness for each emotion between men and women. Consequently, our results indicate that men use more facial muscles, but women provide more discriminative facial expressions for each emotion.

6. Conclusions

We proposed a new method for verifying differences in facial movement according to sex. We designed the experimental analysis in three phases. First, we analyzed the amount of facial movement for each sex. The distributions of men's significant facial movement points were similar for all emotions, whereas the distributions of women's significant facial movement points varied across emotions. This means that women's ability to distinguish one emotion from another in terms of facial movement is higher than men's. Second, when specific emotions were elicited, facial movement was larger than in the neutral state for both men and women. We performed a t-test to verify the statistically significant differences between men and women; the inner eyebrow of the right eye was a commonly significant region for all emotions. Excitement had the largest number of significant points, while Awe had the least. These results concur with previous research. The present results could help in the development of smart content based on emotion.
In future research, we plan to match facial movements with Ekman's action units. Because the FACS has been described rather than verified, we will work to verify it. Using the algorithm that we developed, we can analyze the direction and size of movement in particular regions. Further, we will develop content based on facial movements. Consequently, we expect the verification of intrinsic facial movements to contribute not only to many kinds of emotional and affective applications [19,51] but also to the forensics and security fields [52].

Acknowledgments

This work was supported by the ICT R&D program of MSIP/IITP [R0126-15-1045, The development of technology for social life logging based on analysis of social emotion and intelligence of convergence content]. In addition, this research was supported by the Bio & Medical Technology Development Program of the NRF funded by the Korean government, MSIP (2016M3A9E1915855).

Author Contributions

E. Lee conceived and designed the experiments; Y. Kim performed the experiments; K. Suh and E. Lee analyzed the data; Y. Kim and E. Lee contributed reagents/materials/analysis tools; and K. Suh wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. James, A.P. Heart rate monitoring using human speech spectral features. Hum. Cent. Comput. Inf. Sci. 2015, 5, 1–12. [Google Scholar] [CrossRef]
  2. Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083. [Google Scholar] [PubMed]
  3. Park, M.W.; Kim, C.J.; Hwang, M.; Lee, E.C. Individual emotion classification between happiness and sadness by analyzing photoplethysmography and skin temperature. In Proceedings of the 4th World Congress on Software Engineering, Beijing, China, 23–25 May 2013; pp. 190–194.
  4. Wu, N.; Jiang, H.; Yang, G. Emotion recognition based on physiological signals. In International Conference on Brain Inspired Cognitive Systems; Springer: Berlin/Heidelberg, Germany, 2012; pp. 311–320. [Google Scholar]
  5. Lee, E.; Heo, H.; Park, K. The comparative measurements of eyestrain caused by 2D and 3D displays. IEEE Trans. Consum. Electron. 2010, 53, 1677–1683. [Google Scholar] [CrossRef]
  6. Gunes, H.; Piccardi, M. Bi-modal emotion recognition from expressive face and body gesture. J. Netw. Comput. Appl. 2007, 30, 1334–1345. [Google Scholar] [CrossRef]
  7. Partala, T.; Surakka, V. Pupil size variation as indication of affective processing. Int. J. Hum. Comput. Stud. 2003, 59, 185–198. [Google Scholar] [CrossRef]
  8. Song, Y.; Morency, L.; Davis, R. Learning a sparse codebook of facial and body microexpressions for emotion recognition. In Proceedings of the International Conference on Multimodal Interaction, Sydney, Australia, 9–13 December 2013; pp. 237–244.
  9. Ekman, P.; Friesen, W. Facial Action Coding System: A Technique for the Measurement of Facial Movement; Consulting Psychologists Press: Palo Alto, CA, USA, 1978. [Google Scholar]
  10. Heisele, B. Face recognition with support vector machines: Global versus component-based approach. In Proceedings of the International Conference on Computer Vision, Vancouver, BC, Canada, 7–14 July 2001; pp. 688–694.
  11. Jolliffe, I. Principal Component Analysis; John Wiley & Sons Inc.: New York, NY, USA; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
  12. Hong, S.; Byun, H. Facial expression recognition using eigen-points. In Proceedings of the Korean Conference of the Korea Institute of Information Scientists and Engineers, Daejeon, Korea, 23–24 April 2004; Volume 3, pp. 817–819.
  13. Lee, E.; Whang, M.; Ko, D.; Park, S.; Hwang, S. A new social emotion estimating method by measuring micro movement of human bust. In Industrial Applications of Affective Engineering; Watada, J., Otani, T., Shiizuka, H., Lim, C.P., Lee, K.P., Eds.; Springer: London, UK, 2014; pp. 19–26. [Google Scholar]
  14. Zhang, L. Facial expression recognition using facial movement features. IEEE Trans. Biometr. Compend. 2011, 2, 219–229. [Google Scholar] [CrossRef] [Green Version]
  15. Hess, E.H. Attitude and pupil size. Sci. Am. 1965, 212, 46–54. [Google Scholar] [CrossRef] [PubMed]
  16. Go, H.J.; Kwak, K.C.; Lee, D.J.; Chun, M.G.; Chun, M. Emotion recognition from the facial image and speech signal. In Proceedings of the Society of Instrument and Control Engineers Annual Conference, Fukui, Japan, 4–6 August 2003; Volume 3, pp. 2890–2895.
  17. Allen, J.G.; Haccoun, D.M. Sex differences in emotionality: A multidimensional approach. Hum. Relat. 1976, 29, 711–722. [Google Scholar] [CrossRef]
  18. Montagne, B.; Kessels, R.P.; Frigerio, E.; de Haan, E.H.; Perrett, D.I. Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cogn. Process 2005, 6, 136–141. [Google Scholar] [CrossRef] [PubMed]
  19. Dammak, M.; Wali, A.; Alimi, A.M. Viewer’s affective feedback for video summarization. J. Inf. Process. Syst. 2015, 11, 76–94. [Google Scholar]
  20. Julian, T.; Bjorn, H. Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scand. J. Psychol. 2000, 41, 243–246. [Google Scholar]
  21. Sebastian, M.; Gunnar, R.; Jason, W.; Bernhard, S.; Klaus, M. Fisher discriminant analysis with kernels. In Proceedings of the IEEE Workshop on Neural Network for Signal Processing, Madison, WI, USA, 23–25 August 1999; pp. 41–48.
  22. Hardyck, C.; Petrinovich, L.F. Left-handedness. Psychol. Bull. 1977, 84, 385. [Google Scholar] [CrossRef] [PubMed]
  23. Safer, M.A. Sex and hemisphere differences in access to codes for processing emotional expressions and faces. J. Exp. Psychol. Gen. 1981, 110, 86. [Google Scholar] [CrossRef] [PubMed]
  24. Schweitzer, L.; Chacko, R. Cerebral lateralization: Relation to subject’s sex. Cortex 1980, 16, 559–566. [Google Scholar] [CrossRef]
  25. Inglis, J.; Lawson, J.S. Sex differences in the effects of unilateral brain damage on intelligence. Science 1981, 212, 693–695. [Google Scholar] [CrossRef] [PubMed]
  26. McGlone, J. Sex differences in brain asymmetry survive peer commentary! Behav. Brain Sci. 1980, 3, 251–263. [Google Scholar] [CrossRef]
  27. Ladavas, E.; Umiltà, C.; Ricci-Bitti, P.E. Evidence for sex differences in right-hemisphere dominance for emotions. Neuropsychologia 1980, 18, 361–366. [Google Scholar] [CrossRef]
  28. McKeever, W.F.; Dixon, M.S. Right-hemisphere superiority for discriminating memorized from nonmemorized faces: Affective imagery, sex, and perceived emotionality effects. Brain Lang. 1981, 12, 246–260. [Google Scholar] [CrossRef]
  29. Strauss, E.; Kaplan, E. Lateralized asymmetries in self-perception. Cortex 1980, 16, 289–293. [Google Scholar] [CrossRef]
  30. Borod, J.C.; Caron, H.S. Facedness and emotion related to lateral dominance, sex and expression type. Neuropsychologia 1980, 18, 237–242. [Google Scholar] [CrossRef]
  31. Borod, J.C.; Koff, E.; White, B. Facial asymmetry in posed and spontaneous expressions of emotion. Brain Cogn. 1983, 2, 165–175. [Google Scholar] [CrossRef]
  32. Campbell, R. Asymmetries in interpreting and expressing a posed facial expression. Cortex 1978, 14, 327–342. [Google Scholar] [CrossRef]
  33. Campbell, R. Asymmetry in facial expression. Science 1980, 29, 833–834. [Google Scholar]
  34. Heller, W.; Levy, J. Perception and expression of emotion in right-handers and left-handers. Neuropsychologia 1981, 19, 263–272. [Google Scholar] [CrossRef]
  35. Sackeim, H.A.; Gur, R.C. Lateral asymmetry in intensity of emotional expression. Neuropsychologia 1978, 16, 473–481. [Google Scholar] [CrossRef]
  36. Cacioppo, J.T.; Petty, R.E. Lateral asymmetry in the expression of cognition and emotion. J. Exp. Psychol. Hum. Percept. Perform. 1981, 7, 333. [Google Scholar] [CrossRef]
  37. Knox, K. An Investigation of Nonverbal Behavior in Relation to Hemisphere Dominance. Master’s Thesis, University of California, San Francisco, CA, USA, 1972. [Google Scholar]
  38. Fox, N.A.; Davidson, R.J. The Psychobiology of Affective Development (PLE: Emotion) (Volume 7); Psychology Press: London, UK, 2014. [Google Scholar]
  39. Ekman, P.; Hager, J.C.; Friesen, W.V. The symmetry of emotional and deliberate facial actions. Psychophysiology 1981, 18, 101–106. [Google Scholar] [CrossRef]
  40. Lynn, J.G.; Lynn, D.R. Face-hand laterality in relation to personality. J. Abnorm. Soc. Psychol. 1938, 33, 291. [Google Scholar] [CrossRef]
  41. Lynn, J.G.; Lynn, D.R. Smile and hand dominance in relation to basic modes of adaptation. J. Abnorm. Soc. Psychol. 1943, 38, 250. [Google Scholar] [CrossRef]
  42. Abhijit, J. Kinect for Windows SDK Programming Guide; Packt Publishing Ltd.: Birmingham, UK, 2012. [Google Scholar]
  43. Kim, Y.; Kim, H.; Lee, E. Emotion classification using facial temporal sparsity. IJAER 2014, 9, 24793–24801. [Google Scholar]
  44. Ahonen, T. Face description with local binary patterns: Application to face recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 2037–2041. [Google Scholar] [CrossRef] [PubMed]
  45. Lee, E.C.; Jung, H.; Kim, D. New finger biometric method using near infrared imaging. Sensors 2011, 11, 2319–2333. [Google Scholar] [CrossRef] [PubMed]
  46. Hamming, R. Error detecting and error correcting codes. Bell. Syst. Tech. J. 1950, 29, 147–160. [Google Scholar] [CrossRef]
  47. Jana, M.; Allan, H. Affective image classification using features inspired by psychology and art theory. In Proceedings of the International Conference on Multimedia, Firenze, Italy, 25–29 October 2010; pp. 83–92.
  48. Lang, P.; Bradley, M.; Cuthbert, B. International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual; Technical Report No. A-8; University of Florida: Gainesville, FL, USA, 2008. [Google Scholar]
  49. Easterbrook, J. The effect of emotion on cue utilization and the organization of behavior. Psychol. Rev. 1959, 66, 183–201. [Google Scholar] [CrossRef] [PubMed]
  50. Kret, M.; Gelder, B. A review on sex differences in processing emotional signals. Neuropsychologia 2012, 50, 1211–1221. [Google Scholar] [CrossRef] [PubMed]
  51. Rho, S.; Yeo, S. Bridging the semantic gap in multimedia emotion/mood recognition for ubiquitous computing environment. J. Supercomput. 2013, 65, 274–286. [Google Scholar] [CrossRef]
  52. Lee, J.; Chae, H.; Hong, K. A fainting condition detection system using thermal imaging cameras based object tracking algorithm. J. Converg. 2015, 6, 1–15. [Google Scholar] [CrossRef]
Figure 1. Camera space of Kinect face tracking [42].
Figure 2. Flow chart of the method for measuring facial feature changes.
Figure 3. Facial feature points: (a) the initial 121 feature points provided by the Kinect face tracking SDK; and (b) the 31 feature points chosen as significant facial expression features and their numbering.
Figure 4. Example of an LBP operator [45].
Figure 5. Example of facial feature alignment at one of the facial feature points.
Figure 6. The emotions used, mapped in Russell's emotion model.
Figure 7. Experimental procedure.
Figure 8. Facial movements for men (Red dotted line: vertical face center).
Figure 9. Facial movements for women (Red dotted line: vertical face center).
Figure 10. Results of the t-test between men and women (squares and triangles indicate significant differences at significance levels of p < 0.01 and p < 0.1, respectively).
Figure 11. Average amount of facial movement at each emotion and standard deviation.
Table 1. Amount of men's facial feature movements according to the eight emotional stimuli (yellow-filled cells: features greater than the average over all points of both sexes).
Feature # | Amusement | Excitement | Contentment | Awe | Fear | Anger | Disgust | Sad
No. 1 | 0.567 | 1.144 | 1.051 | 0.928 | 1.181 | 0.385 | 1.091 | 1.203
No. 2 | 1.437 | 2.296 | 2.359 | 2.030 | 2.262 | 1.972 | 2.357 | 2.367
No. 3 | −1.340 | −1.571 | −1.504 | −1.446 | −1.528 | −1.593 | −1.665 | −1.697
No. 4 | 1.787 | 1.791 | 1.617 | 1.711 | 2.107 | 1.674 | 2.146 | 1.981
No. 5 | 1.553 | 1.454 | 1.251 | 0.876 | 1.127 | 0.810 | 1.119 | 1.493
No. 6 | 1.839 | 2.606 | 2.243 | 2.251 | 2.547 | 2.082 | 2.007 | 2.368
No. 7 | −0.739 | −0.272 | −0.350 | −0.548 | −0.879 | −0.802 | −0.848 | −0.411
No. 8 | 1.305 | 1.585 | 1.360 | 1.264 | 1.688 | 1.248 | 1.019 | 1.401
No. 9 | 1.199 | 1.421 | 1.370 | 1.289 | 1.534 | 1.172 | 1.350 | 1.415
No. 10 | −0.371 | −0.800 | −0.714 | −0.509 | −1.115 | −0.741 | −0.965 | −0.688
No. 11 | 0.337 | 0.158 | 0.001 | 0.035 | 0.273 | 0.107 | 0.236 | 0.239
No. 12 | 0.393 | 0.311 | 0.125 | 0.399 | 0.523 | 0.417 | 0.627 | 0.252
No. 13 | −0.411 | −0.269 | −0.150 | −0.356 | −0.372 | −0.671 | −0.546 | −0.320
No. 14 | 0.145 | −0.311 | −0.473 | −0.063 | 0.620 | −0.159 | 0.773 | 0.051
No. 15 | 2.198 | 2.456 | 2.524 | 2.340 | 2.256 | 2.307 | 2.298 | 2.386
No. 16 | −0.909 | −0.457 | −0.385 | −0.578 | −0.586 | −0.839 | −0.721 | −0.644
No. 17 | 0.058 | −0.069 | −0.503 | −0.537 | 0.148 | −0.374 | 0.066 | 0.059
No. 18 | 0.777 | 1.536 | 1.354 | 1.220 | 1.797 | 1.050 | 1.767 | 1.523
No. 19 | −4.364 | −5.530 | −5.524 | −4.879 | −5.051 | −3.678 | −4.940 | −5.362
No. 20 | 1.677 | 1.818 | 2.025 | 1.628 | 1.373 | 1.433 | 1.427 | 1.745
No. 21 | 2.793 | 3.475 | 3.282 | 3.188 | 3.562 | 3.060 | 3.598 | 3.508
No. 22 | 1.485 | 1.954 | 1.496 | 1.452 | 2.281 | 1.215 | 2.072 | 1.905
No. 23 | −0.108 | −0.217 | −0.242 | −0.282 | −0.155 | −0.325 | −0.026 | −0.145
No. 24 | 1.975 | 1.990 | 1.804 | 1.890 | 2.298 | 1.850 | 2.171 | 1.956
No. 25 | 1.269 | 1.250 | 1.370 | 1.290 | 1.219 | 1.346 | 1.276 | 1.265
No. 26 | −1.850 | −2.422 | −2.200 | −2.353 | −2.124 | −2.563 | −2.198 | −2.372
No. 27 | −0.218 | −0.387 | −0.445 | −0.257 | −0.416 | 0.029 | −0.615 | −0.381
No. 28 | 1.472 | 1.449 | 1.425 | 1.369 | 1.591 | 1.829 | 1.522 | 1.505
No. 29 | 0.858 | 0.516 | 0.595 | 0.858 | 0.923 | 0.792 | 0.868 | 0.634
No. 30 | 0.714 | 1.157 | 0.973 | 0.819 | 1.333 | 0.709 | 1.157 | 1.122
No. 31 | 0.174 | 0.164 | 0.164 | 0.134 | 0.282 | −0.029 | 0.286 | 0.175
Average of all features of men | 0.516 | 0.588 | 0.513 | 0.489 | 0.667 | 0.442 | 0.603 | 0.598
Average of all features of both sexes | 0.509 | 0.486 | 0.448 | 0.396 | 0.368 | 0.319 | 0.391 | 0.434
Table 2. Amount of women's facial feature movements according to the eight emotional stimuli (yellow-filled cells: features greater than the average over all points of both sexes).
Feature # | Amusement | Excitement | Contentment | Awe | Fear | Anger | Disgust | Sad
No. 1 | 0.468 | 2.127 | 0.850 | 0.638 | −0.056 | 0.604 | −0.122 | 0.788
No. 2 | 1.134 | 1.128 | 1.129 | 1.276 | 1.097 | 0.719 | 1.783 | 1.672
No. 3 | 0.370 | −0.380 | −0.540 | −0.519 | −0.51 | −0.172 | −0.667 | −0.743
No. 4 | −1.893 | −3.316 | −2.794 | −2.755 | −1.723 | −2.403 | −2.113 | −2.590
No. 5 | −0.349 | 1.225 | −0.184 | 0.254 | 0.274 | 0.633 | −0.045 | −0.310
No. 6 | −1.618 | 0.269 | −0.137 | −0.526 | −0.440 | −1.502 | −0.157 | −0.014
No. 7 | 0.276 | −2.243 | −0.830 | −0.559 | −0.528 | −1.150 | 0.170 | −0.021
No. 8 | 0.811 | −1.004 | −0.319 | −0.599 | −0.751 | −0.931 | −1.436 | −0.993
No. 9 | −1.499 | −1.743 | −0.211 | −0.297 | −1.351 | −1.314 | 0.445 | 0.377
No. 10 | −1.997 | −3.523 | −2.564 | −2.264 | −1.660 | −1.898 | −1.882 | −1.935
No. 11 | 1.209 | 0.947 | 1.617 | 1.670 | 1.980 | 1.310 | 2.211 | 1.788
No. 12 | 1.357 | 3.870 | 5.596 | 4.808 | 2.208 | 4.075 | 3.249 | 4.939
No. 13 | 8.427 | 4.692 | 4.115 | 4.182 | 1.973 | 3.773 | 2.728 | 4.156
No. 14 | 0.070 | −0.213 | −0.287 | −0.413 | −0.222 | −0.231 | −0.238 | −0.349
No. 15 | −0.563 | 0.441 | 0.214 | 0.086 | 1.379 | −0.220 | 0.603 | 0.222
No. 16 | −5.222 | −3.021 | −3.426 | −3.642 | −1.763 | −2.797 | −3.471 | −4.497
No. 17 | −1.020 | −1.740 | −0.185 | −0.340 | −0.341 | −0.722 | 0.037 | −0.288
No. 18 | 1.541 | 1.647 | 1.003 | 0.788 | −0.050 | 0.939 | 0.408 | 0.636
No. 19 | 1.113 | −0.972 | −0.079 | −0.139 | −0.568 | −0.300 | −0.116 | −0.054
No. 20 | −1.058 | 1.211 | −1.026 | −0.840 | −0.648 | −0.404 | −0.399 | −0.908
No. 21 | 1.106 | 0.551 | 0.022 | 0.046 | −0.701 | −0.006 | −0.405 | −0.184
No. 22 | 4.192 | 3.790 | 3.811 | 3.527 | 3.247 | 2.912 | 2.909 | 3.410
No. 23 | 3.359 | 0.509 | 0.378 | 0.128 | −0.832 | −0.093 | −0.846 | 0.502
No. 24 | 3.162 | 2.890 | 2.515 | 2.069 | 1.523 | 1.724 | 0.953 | 1.625
No. 25 | −0.447 | 0.884 | 0.331 | 0.081 | 0.354 | 0.610 | 0.288 | −0.315
No. 26 | 3.898 | 3.905 | 3.920 | 3.976 | 2.621 | 3.651 | 3.662 | 2.785
No. 27 | 1.203 | 0.246 | 0.546 | 0.566 | −0.041 | 0.275 | 0.296 | 0.559
No. 28 | −0.293 | 1.572 | 1.242 | 1.119 | 0.802 | 0.988 | 1.024 | 1.139
No. 29 | −1.895 | −1.247 | −1.425 | −1.487 | −1.497 | −1.070 | −1.581 | −1.210
No. 30 | −0.359 | −0.595 | −1.189 | −1.241 | −1.351 | −0.792 | −1.550 | −1.522
No. 31 | 0.086 | 0.020 | −0.194 | −0.224 | −0.285 | −0.117 | −0.182 | −0.283
Average of all features of women | 0.502 | 0.384 | 0.383 | 0.302 | 0.068 | 0.196 | 0.179 | 0.270
Average of all features of both sexes | 0.509 | 0.486 | 0.448 | 0.396 | 0.368 | 0.319 | 0.391 | 0.434
