Article

“When You’re Smiling”: How Posed Facial Expressions Affect Visual Recognition of Emotions

Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, 41125 Modena, Italy
* Author to whom correspondence should be addressed.
Brain Sci. 2023, 13(4), 668; https://doi.org/10.3390/brainsci13040668
Submission received: 14 March 2023 / Revised: 6 April 2023 / Accepted: 14 April 2023 / Published: 16 April 2023
(This article belongs to the Section Social Cognitive and Affective Neuroscience)

Abstract

Facial imitation occurs automatically during the perception of an emotional facial expression, and preventing it may interfere with the accuracy of emotion recognition. In the present fMRI study, we evaluated the effect of posing a facial expression on the recognition of ambiguous facial expressions. Since facial activity is affected by various factors, such as empathic aptitudes, the Interpersonal Reactivity Index (IRI) questionnaire was administered and its scores were correlated with brain activity. Twenty-six healthy female subjects took part in the experiment. The volunteers were asked to pose a facial expression (happy, disgusted or neutral), then to watch an ambiguous emotional face and finally to indicate whether the emotion perceived was happiness or disgust. The stimuli were blends of happy and disgusted faces. Behavioral results showed that posing an emotional face increased the percentage of responses congruent with the posed emotion. When participants posed a facial expression and perceived a non-congruent emotion, a neural network comprising the bilateral anterior insula was activated. Brain activity was also correlated with empathic traits, particularly with empathic concern, fantasy and personal distress. Our findings support the idea that facial mimicry plays a crucial role in identifying emotions, and that empathic emotional abilities can modulate the brain circuits involved in this process.

1. Introduction

The importance of facial expressions: In 1872, Darwin published The Expression of the Emotions in Man and Animals [1]. This pioneering work highlighted the idea that emotions are universal and discrete entities expressed particularly through the face.
In the century and a half since, psychologists and neuroscientists have confirmed the central role of facial expressions in emotional processing.
Consistent evidence has supported Darwin’s idea of discrete emotional categories characterized by several facial movements that vary to some degree around a typical set of movements. This approach assumes that there is a core facial configuration—the prototype—that can be used to detect the emotional state of an individual. Variations in expressions are ascribed to non-emotional processes such as display rules, emotion-regulation strategies (e.g., suppressing the expression) or culture-specific effects [2,3,4,5,6,7].
The Facial Feedback Hypothesis: The expression and experience of emotion seem to be strictly linked. Once again, the idea was introduced by Darwin, who noted that the experience of an emotion seemed to be intensified when the emotion was freely expressed, and softened when suppressed [1]. This framework is now known as the facial feedback hypothesis or, more recently, embodied emotion [8,9,10]. The latter term refers to the idea that the observed facial expression triggers a simulation of a state in the motor, somatosensory, affective and reward systems, representing the meaning of the expression to the perceiver. The central hypothesis of the embodied emotion theory is that the sensorimotor system is the main contributor to the visual recognition of facial expressions and to other socially relevant tasks, such as action recognition [11,12,13,14], and social interactions including empathy [15,16,17]. The visual perception of an emotional facial expression activates a somatosensory and motor pattern that largely overlaps with the one subserving the production of the same facial expression [18,19]. This reactivation of a facial expression via a sensorimotor simulation is thought to occur by means of facial mimicry [18,20,21]. Finally, the sensorimotor simulation of facial expressions activates the associated emotional system of the observer, who, experiencing the same emotional state as the other person, uses this information to recognize the facial expression seen [18]. Several studies have shown that facial feedback can modulate existing emotions and can induce new ones [22,23,24]. This effect was found for happiness and anger [25,26,27,28], fear and sadness [29] and surprise and disgust [30].
However, a recent meta-analysis [31] found that the effect of facial feedback measured through emotional self-reporting was significant but small, and modulated by several variables. For instance, it showed that facial movements have larger effects on initiating than on modulating the emotional state of the subjects, and that presenting emotional audio or imagined scenarios had a greater effect than pictures, video clips and stories. On the other hand, this review found no effect of the following variables: discrete versus dimensional measures of emotional experience, awareness of the facial feedback manipulation or of being video-recorded, and the gender of the volunteers.
Facial mimicry: Irrespective of whether facial feedback can modulate the subjective experience of emotions, facial mimicry, i.e., the tendency to unconsciously and unintentionally imitate the facial expressions of others, is well documented [32,33,34,35]. In most cases, this phenomenon is undetectable to the eye, but it can be measured using electromyography (EMG): a muscle response matching the observed expression can be detected within one second of the facial expression being presented [20,36,37]. For example, enhanced EMG activity of the zygomaticus major muscle (the muscle that raises the corners of the mouth during smiling) is observed when a person sees a happy face, and increased activity of the corrugator supercilii muscles (the muscles that draw the eyebrows down and inward toward the nose to frown) is observed in response to an angry expression [32].
Mimicry has been considered a “social glue” [38] because it can generate a feeling of similarity, which in turn promotes prosocial behavior [39]. In a recent fMRI study, spontaneous facial mimicry activated the reward neural system, and the magnitude of this effect was positively correlated with trait empathy, thus emphasizing the “reward value of the act of mimicking”. Other studies confirmed the relationship between facial mimicry and individual levels of empathy, as well as between facial mimicry and susceptibility to emotional contagion [40,41,42,43,44].
Several studies have demonstrated that altering spontaneous facial mimicry affects the recognition of emotions in others [41,45,46,47,48,49], modulates the visual working memory representations of facial expressions [50], activates several cortical areas and modulates the activity of emotional regions such as the insula, anterior cingulate and amygdala [51,52].
Although there is general agreement on the notion that visual and sensorimotor cues provide congruent information in decoding a specific and unique facial expression, there is no agreement on how this simulation process is neurally implemented and to what extent facial mimicry is crucial for emotion recognition.
In the present fMRI study, we evaluated the effect of sensorimotor information on the perception of emotion by asking subjects to pose a facial expression while viewing ambiguous emotional faces. The volunteers were asked to pose a facial expression (happy, disgusted or neutral), watch an image of a real face expressing an emotion and indicate whether the emotion perceived was happiness or disgust. Ambiguous faces were obtained as blends of disgusted and happy faces. We also evaluated the effect of empathy on this neural network, as the ability to react emotionally to the emotional expressions of others is one aspect of empathy. Moreover, within the framework of the facial feedback hypothesis, facial mimicry may be key to empathy, as the facial muscles act as a feedback system for a person’s own experience of emotion.
Facial mimicry is influenced by various motivational and contextual factors such as individual traits and, specifically, empathic tendencies [18,34,53]. Previous studies have found that emotional empathy, as opposed to cognitive empathy, predicts the extent of spontaneous facial mimicry in response to facial expressions [42,44,54,55]. Consequently, we administered the Interpersonal Reactivity Index (IRI) questionnaire [56], a self-administered questionnaire used to assess emotional and cognitive aspects of empathy independently. Given previous research on facial mimicry [53], we hypothesized that trait empathy could modulate the neural circuit that subserves the emotional processing of facial expression.

2. Materials and Methods

2.1. Participants

Twenty-six right-handed young women (mean age: 23.7 years; range: 18–39; mean education: 14 years; range: 13–18) took part in the fMRI study. Handedness was assessed using the Edinburgh Inventory [57] and the participants had no history of neurological or psychiatric diseases. Since previous studies suggested a gender difference in empathy [58,59], only female volunteers were included. The experimental protocol was approved by the local Ethics Committee and all subjects gave their written informed consent to take part in the study.

2.2. Stimuli

The stimuli were ambiguous emotional faces, blends of happy and disgusted faces selected from the Ekman series [60]. Two identities from the series were used. The following blends were used: 50% happy, 50% disgusted; 55% happy, 45% disgusted; 60% happy, 40% disgusted. Neutral faces were also employed for a total of 4 stimuli per identity. Stylized emotional faces (emoticons—happy, H, disgusted, D, neutral, N) were used as a cue for the posed emotional expression to be assumed at the beginning of the trial.
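For illustration, the happiness weights can be mapped onto blended images with a simple pixel-wise cross-fade. This is a minimal sketch, not the procedure used to create the Ekman-based stimuli: true morphs also warp facial geometry, and the file names here are hypothetical.

```python
import numpy as np
from PIL import Image

def blend_faces(happy_path: str, disgust_path: str, w_happy: float) -> Image.Image:
    """Pixel-wise linear blend of two aligned grayscale face images.

    w_happy is the weight of the happy face (e.g., 0.50, 0.55, 0.60);
    the disgusted face receives 1 - w_happy.
    """
    happy = np.asarray(Image.open(happy_path).convert("L"), dtype=np.float32)
    disgust = np.asarray(Image.open(disgust_path).convert("L"), dtype=np.float32)
    blend = w_happy * happy + (1.0 - w_happy) * disgust
    return Image.fromarray(blend.astype(np.uint8))

# The three happiness weights used in the study (image files hypothetical)
for w in (0.50, 0.55, 0.60):
    blend_faces("happy.png", "disgust.png", w).save(f"blend_{round(w * 100)}h.png")
```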

2.3. Procedures

An event-related fMRI paradigm was used. Each subject performed 4 sessions of 27 trials each, for a total of 108 trials. Each trial lasted 14 s and began with a 500 ms change of the background color from black to blue (visual warning cue). The volunteers were then asked to pose an emotional facial expression (H, D, N) according to a stylized face (emoticon) that appeared on the screen (1.5 s), to keep posing the expression while watching a real human face with an ambiguous facial expression (1 s; total duration of the pose = 2.5 s) and, after a 9 s interval, to indicate whether the emotion expressed by the ambiguous face was happiness (h) or disgust (d; 2 s; see Figure 1). Two passive rest blocks lasting 15 and 24 s were included at the beginning and at the end of each session, respectively. Stimulus presentation was counterbalanced across the four sessions. Participants responded by pressing one of two buttons on a response pad placed under their right hand. Behavioral responses were collected during the scanning sessions by means of custom-made software developed in Visual Basic 6 (http://digilander.libero.it/marco_serafini/stimoli_video/, accessed on 10 September 2014). The same software was used to present stimuli via the ESys functional MRI system (http://www.invivocorp.com, accessed on 20 September 2014) remote display.
Subjects were not video-recorded during the fMRI scanning sessions because of safety issues in the MRI environment. For this reason, we cannot provide an illustration of the expressions posed by the participants. However, the accuracy of the posing was controlled via facial electromyography (EMG) recorded using an MRI-compatible EMG recording system (Micromed, Mogliano Veneto, Italy) consisting of three bipolar electrodes and one reference electrode. The electrodes (sintered detection cup electrodes) were positioned in pairs over the corrugator supercilii, levator labii and zygomaticus major muscles [61,62] on the left side of the face. The reference electrode was attached to the left shoulder. Before the electrodes were attached, they were filled with electrode paste and the skin was cleaned with alcohol. The electrode impedance was reduced to 10 kΩ. The digitized EMG signals were transmitted via an optic fiber cable from the high-input impedance amplifier to a computer located outside the scanner room. During fMRI acquisition, a TTL signal was sent every TR (repetition time) via a BNC (Bayonet Neill–Concelman) trigger cable from the MRI console to the EMG computer, allowing synchronization between the acquisition of the functional volumes and the EMG data. The correction of the gradient-echo pulse artifacts was performed offline using the average artifact subtraction (AAS) method [63] implemented in the Brain Quick System Plus software (Micromed, Mogliano Veneto, Italy). The EMG data of each participant were qualitatively analyzed by an expert neurophysiopathology technician to ascertain the congruency between the required facial pose and the muscle activity.
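The core of average artifact subtraction can be sketched as below. This is a simplified illustration of the approach of Allen et al. [63], assuming an artifact that is stable across TRs; production implementations, such as the Brain Quick software used here, add refinements like sliding-window templates and correction for trigger jitter.

```python
import numpy as np

def average_artifact_subtraction(emg: np.ndarray, triggers: np.ndarray,
                                 epoch_len: int) -> np.ndarray:
    """Remove the MR gradient artifact from one EMG channel.

    emg:       1-D signal recorded inside the scanner
    triggers:  sample indices of the TTL pulse sent at each TR
    epoch_len: samples per TR (sampling rate * TR)

    Epochs time-locked to the TR trigger are averaged to form an
    artifact template, which is then subtracted from every epoch.
    """
    clean = np.asarray(emg, dtype=float).copy()
    epochs = np.stack([clean[t:t + epoch_len] for t in triggers
                       if t + epoch_len <= clean.size])
    template = epochs.mean(axis=0)          # average artifact waveform
    for t in triggers:
        if t + epoch_len <= clean.size:
            clean[t:t + epoch_len] -= template
    return clean
```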
At the end of the scanning session, the volunteers completed the Interpersonal Reactivity Index (IRI) [64,65]. The IRI is a self-report rating index designed to measure personal empathy, defined as the “reactions of one individual to the observed experiences of another” [66]. It contains twenty-eight items divided into four subscales (perspective taking, PT; fantasy, FS; empathic concern, EC; personal distress, PD).

2.4. Behavioral Data Analyses

An arc-sine transformation was applied to the percentages of disgust and happiness responses to obtain a normal distribution. A 3 (pose: H, D and N) × 2 (response: h and d) within-subjects ANOVA was conducted, and a Newman–Keuls post-hoc test was used. Spearman’s rank-order correlation coefficient was calculated to assess the correlations between responses and IRI scores.
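The transformation and tests can be sketched in Python as follows. File and column names are hypothetical, and the Newman–Keuls post-hoc test has no standard scipy/statsmodels implementation, so it is omitted here.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from statsmodels.stats.anova import AnovaRM

# Long-format table: one row per subject x pose x response cell;
# 'pct' holds the percentage of trials (hypothetical column names).
df = pd.read_csv("behavior.csv")            # columns: subject, pose, response, pct

# Arc-sine (angular) transform to normalize proportions
df["pct_t"] = np.arcsin(np.sqrt(df["pct"] / 100.0))

# 3 (pose) x 2 (response) within-subjects ANOVA
aov = AnovaRM(df, depvar="pct_t", subject="subject",
              within=["pose", "response"]).fit()
print(aov)

# Spearman correlation between responses and an IRI subscale score
iri = pd.read_csv("iri.csv")                # columns: subject, PT, FS, EC, PD
merged = df.groupby("subject")["pct_t"].mean().to_frame().join(
    iri.set_index("subject"))
rho, p = spearmanr(merged["pct_t"], merged["EC"])
```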

2.5. fMRI Data Acquisition and Analyses

Functional data were acquired using a Philips Achieva 3T system and a gradient-echo echo-planar sequence from 30 axial contiguous slices (TR = 2000 ms; in-plane matrix = 64 × 64; voxel size = 3.75 mm × 3.75 mm × 4 mm) over four 651 s sessions per participant. A high-resolution T1-weighted anatomical image was acquired for each participant to allow anatomical localization. The volume consisted of 170 sagittal slices (TR = 9.9 ms; TE = 4.6 ms; in-plane matrix = 256 × 256; voxel size = 1 mm × 1 mm × 1 mm).
fMRI analysis was performed using Matlab version R2013a (The MathWorks Inc., Natick, MA, USA) and the standard SPM12 (Statistical Parametric Mapping, Wellcome Department of Imaging Neuroscience, London, UK) approach. Functional volumes of each participant were slice-time corrected, realigned to the first volume acquired, normalized to the MNI (Montreal Neurological Institute) template implemented in SPM12 and smoothed with an 8 mm × 8 mm × 8 mm FWHM Gaussian kernel. Due to excessive movement during scanning, the last two sessions were excluded for three subjects, and the last session was excluded for five subjects.
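One detail of the smoothing step is worth making explicit: SPM specifies the kernel as FWHM in millimeters, whereas generic filters take a standard deviation in voxels, so the kernel must be converted. A minimal sketch of that conversion (not the authors' SPM12 pipeline) is shown below.

```python
import numpy as np
import nibabel as nib
from scipy.ndimage import gaussian_filter

def smooth_fwhm(img_path: str, fwhm_mm: float = 8.0) -> nib.Nifti1Image:
    """Smooth a normalized volume with a Gaussian kernel given in mm FWHM.

    Conversion: sigma = FWHM / (2 * sqrt(2 * ln 2)) / voxel_size,
    computed per axis so anisotropic voxels are handled correctly.
    """
    img = nib.load(img_path)
    voxel_sizes = np.asarray(img.header.get_zooms()[:3])
    sigma_vox = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / voxel_sizes
    smoothed = gaussian_filter(img.get_fdata(), sigma=sigma_vox)
    return nib.Nifti1Image(smoothed, img.affine, img.header)
```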
Functional data of each participant were first analyzed individually and then fed into second-level random effects analyses. By means of the general linear model implemented in SPM12, a 3 × 2 factorial design analysis was performed. The first factor represented the pose (happy, H; disgusted, D; and neutral, N), whereas the second factor was the subject’s response, i.e., happiness (h) or disgust (d). Each condition was modeled by convolving the stimulus onset (ambiguous emotional face) and each motor response with the standard hemodynamic response function (HRF) to create regressors of interest. Motion parameters obtained from the realignment were used as additional regressors of no interest. According to the aim of the study, the linear contrasts of “incongruent (posing disgust and perceiving happiness, Dh, and posing happiness and perceiving disgust, Hd) versus neutral (posing neutral and perceiving happiness, Nh, and posing neutral and perceiving disgust, Nd)” and “congruent (posing disgust and perceiving disgust, Dd, and posing happiness and perceiving happiness, Hh) versus neutral” were used to study the effect of the pose; the resulting contrast images were entered into the random effects group analyses.
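A regressor of interest of the kind described here is built by convolving a stick function at the event onsets with a canonical HRF. A minimal sketch follows; the double-gamma parameters match the commonly used SPM defaults, while the onset times and scan count are hypothetical.

```python
import numpy as np
from scipy.stats import gamma

TR = 2.0  # seconds, as acquired in this study

def canonical_hrf(tr: float, duration: float = 32.0) -> np.ndarray:
    """SPM-style double-gamma HRF sampled at the TR: a positive gamma
    peaking around 6 s minus a scaled undershoot peaking around 16 s."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma.pdf(t, a=6.0) - gamma.pdf(t, a=16.0) / 6.0
    return hrf / hrf.sum()

def make_regressor(onsets_s: np.ndarray, n_scans: int, tr: float = TR) -> np.ndarray:
    """Convolve a stick function at the event onsets with the HRF."""
    sticks = np.zeros(n_scans)
    sticks[(onsets_s / tr).astype(int)] = 1.0
    return np.convolve(sticks, canonical_hrf(tr))[:n_scans]

# e.g., one regressor per pose x response condition (onsets hypothetical)
dh_onsets = np.array([14.0, 70.0, 126.0])   # trials with pose = D, response = h
regressor = make_regressor(dh_onsets, n_scans=325)
```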
Finally, regression analyses were used to explore which brain regions showed a correlation with individual empathic personality traits assessed using the post-scanning questionnaires.
A family-wise error (FWE) correction was used for the “Incongruent vs. Neutral” and “Congruent vs. Neutral” contrasts, whereas a double statistical threshold (voxel-wise p < 0.001 and spatial extent) was adopted for regression analyses to correct for multiple comparisons; the latter combined significance of α < 0.05 was assessed using the 3dClustSim AFNI routine, using the “-acf” option (https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dClustSim.html, accessed on 28 November 2018).
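Applying the double threshold can be sketched as follows. Note that 3dClustSim itself estimates the minimum cluster size by Monte Carlo simulation from the data's smoothness; this sketch only applies a given cluster-extent threshold k_min to a t-map.

```python
import numpy as np
from scipy.ndimage import label
from scipy.stats import t as t_dist

def cluster_threshold(tmap: np.ndarray, df: int, k_min: int,
                      p_voxel: float = 0.001) -> np.ndarray:
    """Voxel-wise p < p_voxel (one-tailed), then keep only clusters of
    at least k_min contiguous voxels (k_min from, e.g., 3dClustSim)."""
    t_crit = t_dist.ppf(1.0 - p_voxel, df)
    supra = tmap > t_crit                 # voxel-wise height threshold
    labels, n_clusters = label(supra)     # face-connected clusters
    out = np.zeros_like(tmap)
    for i in range(1, n_clusters + 1):
        mask = labels == i
        if mask.sum() >= k_min:           # spatial extent threshold
            out[mask] = tmap[mask]
    return out
```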
For all analyses, the Matthew Brett correction (mni2tal: http://www.mrc-cbu.cam.ac.uk/Imaging/mnispace.html, accessed on 28 November 2018) was applied to the SPM-MNI coordinates to obtain the coordinates in Talairach space [67].
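Brett's mni2tal conversion applies one approximate affine above the AC plane and another below it. The sketch below reproduces the published coefficients; as a check, the first peak in Table 1 converts as expected.

```python
import numpy as np

def mni2tal(x: float, y: float, z: float) -> np.ndarray:
    """Matthew Brett's approximate MNI -> Talairach conversion:
    one affine for z >= 0 and another for z < 0."""
    upper = np.array([[0.9900,  0.0000, 0.0000],
                      [0.0000,  0.9688, 0.0460],
                      [0.0000, -0.0485, 0.9189]])
    lower = np.array([[0.9900,  0.0000, 0.0000],
                      [0.0000,  0.9688, 0.0420],
                      [0.0000, -0.0485, 0.8390]])
    mni = np.array([x, y, z], dtype=float)
    return (upper if z >= 0 else lower) @ mni

print(mni2tal(46, -8, 38))  # (46, -8, 38) -> ~(45.5, -6.0, 35.3), cf. Table 1
```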

3. Results

The qualitative analysis of the EMG data confirmed that the participants actually posed the required emotional facial expressions for each trial.

3.1. Behavioral Data

The ANOVA revealed a significant effect of the factor response (happiness responses were more frequent than disgust responses; F = 70.5; p < 0.001; df = 1; n = 26; power = 1) and of the interaction between response and pose (F = 6.6; p = 0.001; df = 2; n = 26; power = 0.9; Figure 2). Post-hoc tests showed significant differences between posing disgust and responding disgust vs. posing disgust and responding happiness (p = 0.02), and between posing happiness and responding happiness vs. posing happiness and responding disgust (p = 0.04).
No significant correlation between behavioral data and the IRI scores (subtest and total scores) was found.

3.2. fMRI Data

3.2.1. Effect of Posed Emotions on Perceived Emotions

The contrast of incongruent vs. neutral identified areas of significant signal change in the incongruent conditions (i.e., when subjects posed an expression and perceived a different one) as compared with the neutral conditions (no pose). Increased activations were detected in a wide range of cortical and subcortical regions, including the bilateral anterior insula (AI), right pre- and postcentral gyri and cerebellum (Table 1 and Figure 3; the activations shown survived FWE correction). A similar pattern of activation was found for the contrast of congruent vs. neutral, but at a lower, non-significant level.

3.2.2. Correlations with Empathy Subscales

  • Empathic concern subscale
A significant positive correlation was found between the activation of the left precuneus and superior parietal lobule and the empathic concern scores when the subjects were posing and perceiving happiness as compared to posing happiness and perceiving disgust (Hh vs. Hd; Table 2 and Figure 4).
  • Fantasy subscale
A positive correlation with the fantasy score was found with the right IFG/AI and caudate nucleus and left temporal cortex (inferior, middle and superior temporal gyri) and supramarginal gyrus when posing disgust and perceiving happiness as compared to posing and perceiving happiness (Dh vs. Hh; Table 3 and Figure 5).
  • Personal distress subscale
The comparison between perceived happiness and perceived disgust, irrespective of the pose (h vs. d), showed a positive correlation between personal distress scores and right superior parietal lobule, angular gyrus, cuneus and precuneus, left pre- and postcentral gyrus, IFG/AI and inferior parietal lobule, bilateral occipital cortex, fusiform gyrus and cerebellum (Table 4 and Figure 6).
In addition, we found a significant positive correlation with the right lingual gyrus and bilateral cerebellum when posing with a neutral expression and perceiving happiness compared to posing with a neutral expression and perceiving disgust (Nh vs. Nd) (Table 5; Figure 7).
No significant correlation was found for the PT subscale.

4. Discussion

In the present event-related fMRI study, we evaluated whether posing a facial expression can affect automatic emotion recognition. We asked participants to pose a facial expression (happy, disgusted or neutral) while judging the emotion of visually presented ambiguous faces. Our behavioral results showed that posing an emotion shifts the visual perception of ambiguous expressions towards that same emotion: posing a disgusted face increased the proportion of disgust responses, whereas posing a happy face increased the proportion of happiness responses.
Several studies have demonstrated that humans voluntarily imitate or unconsciously match the nonverbal behaviors of others; these phenomena are collectively called mimicry. Mimicry has been demonstrated in response to facial expressions [32], body movements [68] and even pupil dilation [69]. Facial mimicry is often imperceptible and can be detected only using specialized, sensitive methods that can accurately measure contractions of the facial muscles. Using electromyography (EMG), muscle reactions matching observed facial or bodily expressions can be detected within a second of exposure. For instance, observing positive emotional facial expressions can lead to heightened muscle activity in the zygomaticus major, while negative expressions can activate the corrugator supercilii within 500 milliseconds of stimulus presentation [32,61,70].
Facial expression recognition is supported by visual expertise, which is partially responsible for the capability to extract information from faces [71]. In addition, the sensorimotor simulation achieved by reproducing the motor movements of the observed facial expression can facilitate the recognition of the emotional expression. This motor activity, which is typically unconscious, is thought to elicit partial activation in the neural circuits involved in experiencing the corresponding emotion. The observer can then implicitly infer the internal state of the person displaying the expression. In support of this hypothesis, a few studies have evaluated the impact of facial movement on the ability to recognize emotions from facial expressions [18]. For instance, Ponari et al. [47] developed a series of experiments in which they manipulated the participants’ ability to move either the upper or lower half of their face. Their results indicated that the “lower” manipulation specifically impaired recognition of happiness and disgust, while the “upper” manipulation hindered recognition of anger, and both manipulations affected the recognition of fear. Wood et al. [18] prevented participants from producing facial movements by applying a gel facemask, and asked subjects to distinguish between target facial expressions and similar-looking distractors. They found that the participants’ ability to recognize emotions was impaired, indicating that the sensory and motor processes linked to expression imitation contribute to the visual perceptual processing of facial expressions. More recently, Borgomaneri et al. [41] evaluated whether inhibiting mimicry affects the recognition of happiness conveyed through facial or body expressions. Their results showed that blocking mimicry on the lower face affected the recognition of happy facial and body expressions, while the recognition of neutral and fearful expressions was unaffected by the experimental manipulation.
Taken together, these findings support the role of facial mimicry in emotion recognition, and suggest that facial mimicry may reflect a comprehensive sensorimotor simulation of others’ emotions, rather than just a muscle-specific replication of an observed motor expression.
Our functional results showed that when participants posed a facial expression and perceived a non-congruent emotion (for instance, posing disgust and perceiving happiness, or posing happiness and perceiving disgust), a neural network comprising the bilateral anterior insula, motor cortex, cerebellum and superior temporal gyri was activated. This pattern of activation was also present when the pose and the perceived emotion were congruent, but at a lower, non-significant level.
A similar pattern of fMRI signal changes was detected by Braadbaart et al. [72]. In this study, participants were instructed to either imitate (match) or perform mismatched facial movements in response to blends of facial emotional expressions. Their results showed greater neural activity during the execution of mismatched actions as compared to imitation, especially in the insula bilaterally.
The involvement of the anterior insula could be ascribed to its known role in detecting conflicts between action and response, as described by Ullsperger et al. [73]. This involvement is not surprising, given the insula’s role in both the visual perception of facial expressions and the self-expression of the same emotion [74,75]. It is reasonable to expect that the insula’s role in error monitoring would result in stronger responses to facial stimuli that do not match the actions being performed, leading to greater sensitivity to incongruency between pose and response. The activation of the motor cortex and the cerebellum could be ascribed to the motor component of the imitation task.
Our functional results also showed that some brain regions exhibit correlations with the individual empathic disposition, as tested using the different IRI subscales.
A previous study demonstrated that empathic disposition could modulate facial feedback in response to emotional expressions. Some studies have shown a correlation between levels of individual empathy or susceptibility to emotional contagion and facial mimicry [44,53,54], although the strength of this relationship remains uncertain.
Williams et al. [76] proposed that imitation is the link between facial mimicry and empathic abilities. The authors argued that while facial mimicry may rely on a primary sensorimotor representation [77,78,79], imitation may require a secondary representation of the intention and the motor plan for that action [80]. Facial imitation may, therefore, involve mechanisms similar to those involved in empathy; indeed, empathy deals with communicating emotions and requires a secondary representation of those emotions. In turn, this representation enables emotion understanding [81]. This hypothesis is linked to the simulation theory of empathy, which suggests that empathizers use their neural systems to imitate actions “offline” in order to imagine and understand the experiences of others [82]. Moreover, this hypothesis is in line with the perception–action model of empathy [83], which suggests that empathy depends on the perception–action coupling mechanisms necessary for imitation. Evidence for a close link between empathic traits and imitation abilities comes from autism studies suggesting that, in these patients, impairments in empathy and imitation co-occur [84]. Moreover, a limited range of facial expressions is considered to have diagnostic value for autism [85]. In support of this view, Williams et al. [76] developed an imitation paradigm in which participants were asked to imitate expressions of faces representing a blend of emotions. The accuracy of imitation was rated by two experimenters and correlated with the empathy quotient (EQ) score. The EQ is a 60-item self-report questionnaire that measures individual differences in empathic ability [81]. The results showed that participants with higher EQ scores exhibited superior facial imitation skills, especially when imitating more complex stimuli.
The link between empathic abilities and facial mimicry could be the facilitation effect of facial imitation in understanding others’ emotions. As already reported above, Borgomaneri et al. [41] recently showed that blocking mimicry on the lower face impaired the recognition of happy facial and body expressions. Furthermore, this impairment was correlated with empathic traits. Specifically, the index of the drop in accuracy of emotion recognition was significantly correlated with the empathic concern (EC) IRI subscale score, that is, individuals with lower levels of emotional empathy were significantly impaired when mimicry was blocked, whereas those with higher levels of emotional empathy showed little or no interference.
Our functional results showed a significant correlation between EC scores and the activation of the left precuneus and superior parietal lobule when congruent facial mimicry facilitated a happiness response (i.e., when the subjects were posing happiness and perceiving happiness, as opposed to posing happiness and perceiving disgust). These brain regions are the functional core of the Default Mode Network (DMN), reflecting self-referential processes that are active during the resting state.
The DMN plays a role in evaluating survival-relevant information from the body and the world [86]. This includes subsuming the other’s point of view, desires, beliefs and intentions, as well as remembering the past and planning for the future [87]. These functions are all self-referential in nature. When a person is engaged in a demanding cognitive task, activity in the DMN is reduced [86,88]. This reduction can be interpreted as an adaptive mechanism that limits self-referential activity in the brain and improves focus on the task. Failure to do so may result in interference from internal emotional states during task performance, as observed in patients with depression [86]. More recently, precuneus activity was found to correlate with scores on the Subjective Happiness Scale (SHS). In particular, trainees of mindfulness meditation [89], which reportedly heightens SHS scores [90] and increases gray matter volume in the precuneus [91], showed reduced resting-state precuneus activity compared with participants who did not undergo the training [92]. In patients with depression, who exhibit low SHS scores [93] and reduced gray matter volume in the precuneus [94], resting-state blood flow in this region was increased during depressive episodes [95,96] and decreased after a significant improvement in their mental health [97].
The sensorimotor simulation of others’ emotions in subjects with higher emotional empathic abilities is correlated with the activity of the precuneus and could reflect their higher capacity to emotionally empathize with other people’s state of mind and to “resonate” with their positive feelings.
We also found a positive correlation with the fantasy score, in particular with the activity of the right IFG/AI and caudate nucleus, left temporal cortex (inferior, middle and superior temporal gyri) and supramarginal gyrus when posing a disgusted expression and perceiving happiness compared to posing and perceiving happiness. These areas, particularly the IFG/AI and caudate nucleus, are consistently activated during disgust processing [75] and could reflect the ability of people with a high capacity for imagination to feel disgust as if they were actually experiencing it.
Finally, we found a positive correlation between the personal distress score and the activity of several posterior areas, the anterior insula and sensorimotor areas when the participants perceived happiness compared to when they perceived disgust. PD measures the tendency to react with a negative, self-focused emotional response to the apprehension or comprehension of another’s emotional state or condition. These activations are in line with other studies, e.g., that of Llamas-Alonso et al. [98], who detected higher levels of activity in the occipital regions in response to happy faces than to angry faces during a pro-saccade task, and an EEG study by Loi et al. [99], who found that viewing happy, but not sad, faces increased the excitability of the face area of M1 bilaterally. Furthermore, recent research [100] related levels of subjective well-being to increased gray matter volumes of the anterior insula. In addition, Luo et al. [101] investigated the correlation of DMN activity with subjective happiness and found greater connectivity in several cortical areas, including the inferior parietal lobule, in people with a high level of subjective unhappiness. These results were interpreted as suggesting that this abnormal activity may indicate excessive negative self-reflection. However, in our protocol, activity in these areas was correlated with the PD score. We speculate that in subjects with a higher proneness to personal distress, these regions are even more related to the perception of positive emotions (happiness) than in the general population.

5. Conclusions

To the best of our knowledge, this is the first study to investigate emotional face processing under ambiguous conditions in relation to personality traits, namely, empathic aptitudes. Our behavioral results revealed that posing an emotion shifts the visual perception of ambiguous expressions towards that same emotion. Our functional results showed that the brain activity underlying this processing is modulated by individual emotional empathic disposition, as tested using the different IRI subscales. In particular, we found a significant correlation between empathic concern (EC) scores and the activation of regions that represent the functional core of the Default Mode Network, reflecting self-referential processes that are active during the resting state. We also found a positive correlation between the fantasy (FS) score and the activity of areas known to be involved in disgust processing [75], which could reflect the ability of people with a high capacity for imagination to feel disgust as if they were actually experiencing it. Finally, we found a positive correlation between the personal distress (PD) score and the activity of several posterior areas, the anterior insula and sensorimotor areas. PD measures the tendency to react with a negative, self-focused emotional response to the apprehension or comprehension of another’s emotional state or condition. In line with the literature, we speculate that in subjects with a higher proneness to personal distress, these regions are even more related to the perception of positive emotions (happiness) than in the general population.
Based on our results, we suggest that under ambiguous conditions, the prevalence of bottom–up sensory stimulation or top–down motor priming is determined by individual characteristics.

Author Contributions

Conceptualization, F.B., P.F.N. and F.L.; investigation, F.B. and D.B.; methodology, data curation and formal analysis, F.B., D.B., C.C. and V.Z.; writing—original draft, F.B.; writing—review and editing, F.B., C.A.P., P.F.N., F.L., D.B., C.C. and V.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Local Ethics Committee (Comitato Etico dell’Area Vasta Emilia Nord; protocol code 134/14, 15 July 2014).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to the privacy policy.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

fMRI: functional magnetic resonance imaging
IRI: Interpersonal Reactivity Index
   PT: perspective taking
   FS: fantasy
   EC: empathic concern
   PD: personal distress
EMG: electromyography
Emoticons (posed emotions):
   H: happy
   D: disgusted
   N: neutral
Perceived emotions:
   h: happiness
   d: disgust
TR: repetition time
BNC: Bayonet Neill–Concelman
AAS: average artifact subtraction
TE: echo time
SPM: Statistical Parametric Mapping
FWE: family-wise error
AI: anterior insula
IFG: inferior frontal gyrus
BA: Brodmann area
l: left
r: right

References

  1. Darwin, C. The Expression of the Emotions in Man and Animals; John Murray: London, UK, 1872. [Google Scholar]
  2. Ekman, P.; Cordaro, D. What is meant by calling emotions basic. Emot. Rev. 2011, 3, 364–370. [Google Scholar] [CrossRef]
  3. Elfenbein, H.A. Nonverbal dialects and accents in facial expressions of emotion. Emot. Rev. 2013, 5, 90–96. [Google Scholar] [CrossRef]
  4. Elfenbein, H.A. Emotional dialects in the language of emotion. In The Science of Facial Expression; Fernandez-Dols, J.-M., Russell, J.A., Eds.; Oxford University Press: New York, NY, USA, 2017; pp. 479–496. [Google Scholar]
  5. Matsumoto, D. Cultural similarities and differences in display rules. Motiv. Emot. 1990, 14, 195–214. [Google Scholar] [CrossRef]
  6. Matsumoto, D.; Keltner, D.; Shiota, M.N.; O’Sullivan, M.; Frank, M. Facial expressions of emotion. In Handbook of Emotions, 3rd ed.; Lewis, M., Haviland-Jones, J.M., Barrett, L.F., Eds.; The Guilford Press: New York, NY, USA, 2008; pp. 211–234. [Google Scholar]
  7. Tracy, J.L.; Randles, D. Four models of basic emotions: A review of Ekman and Cordaro, Izard, Levenson, and Panksepp and Watt. Emot. Rev. 2011, 3, 397–405. [Google Scholar] [CrossRef]
  8. Niedenthal, P.M. Embodying emotion. Science 2007, 316, 1002–1005. [Google Scholar] [CrossRef] [PubMed]
  9. Niedenthal, P.M.; Mondillon, L.; Winkielman, P.; Vermeulen, N. Embodiment of emotion concepts. J. Pers. Soc. Psychol. 2009, 96, 1120–1136. [Google Scholar] [CrossRef]
  10. Niedenthal, P.M.; Mermillod, M.; Maringer, M.; Hess, U. The Simulation of Smiles (SIMS) model: Embodied simulation and the meaning of facial expression. Behav. Brain Sci. 2010, 33, 417–433. [Google Scholar] [CrossRef] [PubMed]
  11. Friston, K. The free-energy principle: A unified brain theory? Nat. Rev. Neurosci. 2010, 11, 127–138. [Google Scholar] [CrossRef]
  12. Kilner, J.M.; Friston, K.J.; Frith, C.D. Predictive coding: An account of the mirror neuron system. Cogn. Process. 2007, 8, 159–166. [Google Scholar] [CrossRef]
  13. Rizzolatti, G.; Fogassi, L.; Gallese, V. Neurophysiological mechanisms underlying the understanding and imitation of action. Nat. Rev. Neurosci. 2001, 2, 661–670. [Google Scholar] [CrossRef]
  14. Rizzolatti, G.; Sinigaglia, C. The functional role of the parieto-frontal mirror circuit: Interpretations and misinterpretations. Nat. Rev. Neurosci. 2010, 11, 264–274. [Google Scholar] [CrossRef]
  15. Iacoboni, M. Imitation, empathy, and mirror neurons. Annu. Rev. Psychol. 2009, 60, 653–670. [Google Scholar] [CrossRef]
  16. Tramacere, A.; Ferrari, P.F. Faces in the mirror, from the neuroscience of mimicry to the emergence of mentalizing. J. Anthr. Sci. 2016, 94, 113–126. [Google Scholar]
  17. Benuzzi, F.; Lui, F.; Ardizzi, M.; Ambrosecchia, M.; Ballotta, D.; Righi, S.; Pagnoni, G.; Gallese, V.; Porro, C.A. Pain Mirrors: Neural Correlates of Observing Self or Others’ Facial Expressions of Pain. Front. Psychol. 2018, 9, 1825. [Google Scholar] [CrossRef]
  18. Wood, A.; Rychlowska, M.; Korb, S.; Niedenthal, P. Fashioning the face: Sensorimotor simulation contributes to facial expression recognition. Trends Cogn. Sci. 2016, 20, 227–240. [Google Scholar] [CrossRef]
  19. Goldman, A.I.; Sripada, C.S. Simulationist models of face-based emotion recognition. Cognition 2005, 94, 193–213. [Google Scholar] [CrossRef] [PubMed]
  20. Korb, S.; Grandjean, D.; Scherer, K.R. Timing and voluntary suppression of facial mimicry to smiling faces in a go/nogo task—An EMG study. Biol. Psychol. 2010, 85, 347–349. [Google Scholar] [CrossRef] [PubMed]
  21. Krumhuber, E.G.; Likowski, K.U.; Weyers, P. Facial mimicry of spontaneous and deliberate Duchenne and non-Duchenne smiles. J. Nonverbal Behav. 2014, 38, 1–11. [Google Scholar] [CrossRef]
  22. Adelmann, P.K.; Zajonc, R.B. Facial efference and the experience of emotion. Ann. Rev. Psychol. 1989, 40, 249–280. [Google Scholar] [CrossRef] [PubMed]
  23. Buck, R. Nonverbal behavior and the theory of emotion: The facial feedback hypothesis. J. Personal. Soc. Psychol. 1980, 38, 811–824. [Google Scholar] [CrossRef]
  24. McIntosh, D.N. Facial feedback hypothesis: Evidence, implications, and directions. Motiv. Emot. 1996, 20, 121–147. [Google Scholar] [CrossRef]
  25. Dimberg, U.; Söderkvist, S. The voluntary facial action technique: A method to test the facial feedback hypothesis. J. Nonverbal Behav. 2011, 35, 17–33. [Google Scholar] [CrossRef]
  26. Söderkvist, S.; Ohlén, K.; Dimberg, U. How the Experience of Emotion is Modulated by Facial Feedback. J. Nonverbal Behav. 2018, 42, 129–151. [Google Scholar] [CrossRef]
  27. Laird, J.D. Self-attribution of emotion: The effects of expressive behavior on the quality of emotional experience. J. Personal. Soc. Psychol. 1974, 29, 475–486. [Google Scholar] [CrossRef] [PubMed]
  28. Rutledge, L.L.; Hupka, R.B. The facial feedback hypothesis: Methodological concerns and new supporting evidence. Motiv. Emot. 1985, 9, 219–240. [Google Scholar] [CrossRef]
  29. Flack, W.F.; Laird, J.D.; Cavallaro, L.A. Separate and combined effects of facial expressions and bodily postures on emotional feelings. Eur. J. Soc. Psychol. 1999, 29, 203–217. [Google Scholar] [CrossRef]
  30. Lewis, M.B. Exploring the positive and negative implications of facial feedback. Emotion 2012, 12, 852–859. [Google Scholar] [CrossRef]
  31. Coles, N.A.; Larsen, J.T.; Lench, H.C. A meta-analysis of the facial feedback literature: Effects of facial feedback on emotional experience are small and variable. Psychol. Bull. 2019, 145, 610–651. [Google Scholar] [CrossRef] [PubMed]
  32. Dimberg, U.; Thunberg, M. Rapid facial reactions to emotional facial expressions. Scand. J. Psychol. 1998, 39, 39–45. [Google Scholar] [CrossRef]
  33. Dimberg, U.; Thunberg, M.; Elmehed, K. Unconscious facial reactions to emotional facial expressions. Psychol. Sci. 2000, 11, 86–89. [Google Scholar] [CrossRef] [PubMed]
  34. Hess, U.; Fischer, A. Emotional mimicry as social regulation. Personal. Soc. Psychol. Rev. 2013, 17, 142–157. [Google Scholar] [CrossRef]
  35. Moody, E.J.; McIntosh, D.N.; Mann, L.J.; Weisser, K.R. More than mere mimicry? The influence of emotion on rapid facial reactions to faces. Emotion 2007, 7, 447–457. [Google Scholar] [CrossRef] [PubMed]
  36. Achaibou, A.; Pourtois, G.; Schwartz, S.; Vuilleumier, P. Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry. Neuropsychologia 2008, 46, 1104–1113. [Google Scholar] [CrossRef] [PubMed]
  37. Dimberg, U.; Petterson, M. Facial reactions to happy and angry facial expressions: Evidence for right hemisphere dominance. Psychophysiology 2000, 37, 693–696. [Google Scholar] [CrossRef] [PubMed]
  38. Lakin, J.L.; Jefferis, V.E.; Cheng, C.M.; Chartrand, T.L. The Chameleon effect as social glue: Evidence for the evolutionary significance of nonconscious mimicry. J. Nonverbal Behav. 2003, 27, 145–162. [Google Scholar] [CrossRef]
  39. Kulesza, W.; Dolinski, D.; Huisman, A.; Majewski, R. The Echo Effect: The power of verbal mimicry to influence prosocial behavior. J. Lang Soc. Psychol. 2014, 33, 183–201. [Google Scholar] [CrossRef]
  40. Bos, P.A.; Jap-tjong, N.; Spencer, H.; Hofman, D. Social context modulates facial imitation of children’s emotional expressions. PLoS ONE 2016, 11, e0167991. [Google Scholar] [CrossRef]
  41. Borgomaneri, S.; Bolloni, C.; Sessa, P.; Avenanti, A. Blocking facial mimicry affects recognition of facial and body expressions. PLoS ONE 2020, 15, e0229364. [Google Scholar] [CrossRef]
  42. Dimberg, U.; Andreasson, P.; Thunberg, M. Emotional empathy and facial reactions to facial expressions. J. Psychophysiol. 2011, 25, 26–31. [Google Scholar] [CrossRef]
  43. Prochazkova, E.; Kret, M.E. Connecting minds and sharing emotions through mimicry: A neurocognitive model of emotional contagion. Neurosci. Biobehav. Rev. 2017, 80, 99–114. [Google Scholar] [CrossRef]
  44. Sonnby-Borgström, M. Automatic mimicry reactions as related to differences in emotional empathy. Scand. J. Psychol. 2002, 43, 433–443. [Google Scholar] [CrossRef]
  45. Neal, D.T.; Chartrand, T.L. Embodied emotion perception: Amplifying and dampening facial feedback modulates emotion perception accuracy. Soc. Psychol. Personal. Sci. 2011, 2, 673–678. [Google Scholar] [CrossRef]
  46. Oberman, L.M.; Winkielman, P.; Ramachandran, V.S. Face to face: Blocking facial mimicry can selectively impair recognition of emotional expressions. Soc. Neurosci. 2007, 2, 167–178. [Google Scholar] [CrossRef] [PubMed]
  47. Ponari, M.; Conson, M.; D’Amico, N.P.; Grossi, D.; Troiano, L. Mapping Correspondence Between Facial Mimicry and Emotion Recognition in Healthy Subjects. Emotion 2012, 12, 1398–1403. [Google Scholar] [CrossRef] [PubMed]
  48. Ipser, A.; Cook, R. Blocking facial mimicry reduces perceptual sensitivity for facial expressions. J. Vis. 2015, 15, 1376. [Google Scholar] [CrossRef]
  49. Stel, M.; van Knippenberg, A. The role of facial mimicry in the recognition of affect. Psychol. Sci. 2008, 19, 984–985. [Google Scholar] [CrossRef] [PubMed]
  50. Sessa, P.; Schiano Lomoriello, A.; Luria, R. Neural measures of the causal role of observers’ facial mimicry on visual working memory for facial expressions. Soc. Cogn. Affect. Neurosci. 2018, 13, 1281–1291. [Google Scholar] [CrossRef]
  51. Lee, T.; Josephs, O.; Dolan, R.J.; Critchley, H.D. Imitating expressions: Emotion-specific neural substrates in facial mimicry. Soc. Cogn. Affect. Neurosci. 2006, 1, 122–135. [Google Scholar] [CrossRef] [PubMed]
  52. Hennenlotter, A.; Dresel, C.; Castrop, F.; Ceballos-Baumann, A.O.; Wohlschläger, A.M.; Haslinger, B. The Link between facial feedback and neural activity within central circuitries of emotion—New insights from botulinum toxin–induced denervation of frown muscles. Cereb. Cortex 2008, 19, 537–542. [Google Scholar] [CrossRef] [PubMed]
  53. Seibt, B.; Mühlberger, A.; Likowski, K.U.; Weyers, P. Facial mimicry in its social setting. Front. Psychol. 2015, 6, 1122. [Google Scholar] [CrossRef]
  54. Balconi, M.; Canavesio, Y. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing. Cogn. Emot. 2016, 30, 210–224. [Google Scholar] [CrossRef] [PubMed]
  55. Harrison, N.A.; Morgan, R.; Critchley, H.D. From facial mimicry to emotional empathy: A role for norepinephrine? Soc. Neurosci. 2010, 5, 393–400. [Google Scholar] [CrossRef] [PubMed]
  56. Davis, M.H. Empathy: A Social Psychological Approach; Westview Press: Boulder, CO, USA, 1996. [Google Scholar]
  57. Oldfield, R.C. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 1971, 9, 97–113. [Google Scholar] [CrossRef] [PubMed]
  58. Decety, J.; Jackson, P.L. The functional architecture of human empathy. Behav. Cogn. Neurosci. Rev. 2004, 3, 71–100. [Google Scholar] [CrossRef]
  59. Singer, T.; Seymour, B.; O’Doherty, J.P.; Stephan, K.E.; Dolan, R.J.; Frith, C.D. Empathic neural responses are modulated by the perceived fairness of others. Nature 2006, 439, 466–469. [Google Scholar] [CrossRef] [PubMed]
  60. Ekman, P.; Friesen, W.V. Pictures of Facial Affect; Consulting Psychologists Press: Palo Alto, CA, USA, 1976. [Google Scholar]
  61. Cacioppo, J.T.; Petty, R.E.; Losch, M.E.; Kim, H.S. Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. J. Personal. Soc. Psychol. 1986, 50, 260. [Google Scholar] [CrossRef]
  62. Fridlund, A.J.; Cacioppo, J.T. Guidelines for human electromyographic research. Psychophysiology 1986, 23, 567–589. [Google Scholar] [CrossRef]
  63. Allen, P.J.; Josephs, O.; Turner, R. A method for removing imaging artifact from continuous EEG recorded during functional MRI. Neuroimage 2000, 12, 230–239. [Google Scholar] [CrossRef]
  64. Davis, M.H. A multidimensional approach to individual differences in empathy. JSAS Catal. Sel. Docs. Psychol. 1980, 10, 85. [Google Scholar]
  65. Albiero, P.; Ingoglia, S.; Lo Coco, A. Contributo all’adattamento italiano dell’Interpersonal Reactivity Index. Test.-Psicometria-Metodol. 2006, 13, 107–125. [Google Scholar]
  66. Davis, M.H. Measuring individual differences in empathy: Evidence for a multidimensional approach. J. Personal. Soc. Psychol. 1983, 44, 113–126. [Google Scholar] [CrossRef]
  67. Talairach, J.; Tournoux, P. Co-Planar Stereotaxic Atlas of the Human Brain; Thieme Medical Publisher: New York, NY, USA, 1988. [Google Scholar]
  68. Chartrand, T.L.; Bargh, J.A. The chameleon effect: The perception—Behavior link and social interaction. J. Personal. Soc. Psychol. 1999, 76, 893–910. [Google Scholar] [CrossRef] [PubMed]
  69. Kret, M.E.; Fischer, A.H.; De Dreu, C.K.W. Pupil mimicry correlates with trust in in-group partners with dilating pupils. Psychol. Sci. 2015, 26, 1401–1410. [Google Scholar] [CrossRef]
  70. Dimberg, U. Facial electromyographic reactions and autonomic activity to auditory stimuli. Biol. Psychol. 1990, 31, 137–147. [Google Scholar] [CrossRef] [PubMed]
  71. Haxby, J.V.; Hoffman, E.A.; Gobbini, M.I. The distributed human neural system for face perception. Trends Cogn. Sci. 2000, 4, 223–233. [Google Scholar] [CrossRef]
  72. Braadbaart, L.; de Grauw, H.; Perrett, D.I.; Waiter, G.D.; Williams, J.H. The shared neural bases of empathy and facial imitation accuracy. Neuroimage 2014, 84, 367–375. [Google Scholar] [CrossRef]
  73. Ullsperger, M.; Harsay, H.A.; Wessel, J.R.; Ridderinkhof, K.R. Conscious perception of errors and its relation to the anterior insula. Brain Struct. Funct. 2010, 214, 629–643. [Google Scholar] [CrossRef]
  74. Carr, L.; Iacoboni, M.; Dubeau, M.; Mazziotta, J.C.; Lenzi, G.L. Neural mechanisms of empathy in humans: A relay from neural systems for imitation to limbic areas. Proc. Natl. Acad. Sci. USA 2003, 100, 5497–5502. [Google Scholar] [CrossRef] [PubMed]
  75. Wicker, B.; Keysers, C.; Plailly, J.; Royet, J.; Gallese, V.; Rizzolatti, G. Both of us disgusted in my insula: The common neural basis of seeing and feeling disgust. Neuron 2003, 40, 655–664. [Google Scholar] [CrossRef]
  76. Williams, J.H.G.; Nicolson, A.T.A.; Clephan, K.J.; De Grauw, H.; Perrett, D. A novel method testing the ability to imitate composite emotional expressions reveals an association with empathy. PLoS ONE 2013, 8, e61941. [Google Scholar] [CrossRef]
  77. Piaget, J. Play, Dreams, and Imitation in Childhood; Routledge and Kegan Paul: London, UK, 1951. [Google Scholar]
  78. Suddendorf, T.; Whiten, A. Mental evolution and development: Evidence for secondary representation in children, great apes and other animals. Psychol. Bull. 2001, 127, 629–650. [Google Scholar] [CrossRef] [PubMed]
  79. Damasio, A.; Carvalho, G.B. The nature of feelings: Evolutionary and neurobiological origins. Nat. Rev. Neurosci. 2013, 14, 143–152. [Google Scholar] [CrossRef]
  80. Wolpert, D.M.; Doya, K.; Kawato, M. A unifying computational framework for motor control and social interaction. Philos. Trans. R. Soc. B Biol. Sci. 2003, 358, 593–602. [Google Scholar] [CrossRef] [PubMed]
  81. Baron-Cohen, S.; Wheelwright, S. The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. J. Autism Dev. Disord. 2004, 34, 163–175. [Google Scholar] [CrossRef] [PubMed]
  82. Gallese, V.; Goldman, A. Mirror neurons and the simulation theory of mind-reading. Trends Cogn. Sci. 1998, 2, 493–501. [Google Scholar] [CrossRef]
  83. Preston, S.D.; de Waal, F.B. Empathy: Its ultimate and proximate bases. Behav. Brain Sci. 2002, 25, 1–20. [Google Scholar] [CrossRef]
  84. Williams, J.H.; Whiten, A.; Suddendorf, T.; Perrett, D.I. Imitation, mirror neurons and autism. Neurosci. Biobehav. Rev. 2001, 25, 287–295. [Google Scholar] [CrossRef]
  85. Lord, C.; Risi, S.; Lambrecht, L.; Cook, E.H., Jr.; Leventhal, B.L.; Di Lavore, P.C.; Pickles, A.; Rutter, M. The autism diagnostic observation schedule-generic: A standard measure of social and communication deficits associated with the spectrum of autism. J Autism Dev. Disord. 2000, 30, 205–223. [Google Scholar] [CrossRef]
  86. Buckner, R.; Andrews-Hanna, J.; Schacter, D. The brain’s default network: Anatomy, function, and relevance to disease. Ann. N. Y. Acad. Sci. 2008, 1124, 1–38. [Google Scholar] [CrossRef]
  87. Raichle, M.; MacLeod, A.M.; Snyder, A.Z.; Powers, W.J.; Gusnard, D.A.; Shulman, G.L. A default mode of brain function. Proc. Natl. Acad. Sci. USA 2001, 98, 676–682. [Google Scholar] [CrossRef]
  88. Gusnard, D.; Akbudak, E.; Shulman, G.; Raichle, M. Medial prefrontal cortex and self-referential mental activity: Relation to a default mode of brain function. Proc. Natl. Acad. Sci. USA 2001, 98, 4259–4264. [Google Scholar] [CrossRef] [PubMed]
  89. Kabat-Zinn, J. Full Catastrophe Living: How to Cope with Stress, Pain and Illness Using Mindfulness Meditation; Delacorte: New York, NY, USA, 1990. [Google Scholar]
  90. O’Leary, K.; Dockray, S. The effects of two novel gratitude and mindfulness interventions on well-being. J. Altern. Complement. Med. 2015, 21, 243–245. [Google Scholar] [CrossRef] [PubMed]
  91. Kurth, F.; Luders, E.; Wu, B.; Black, D.S. Brain gray matter changes associated with mindfulness meditation in older adults: An exploratory pilot study using voxel-based morphometry. Neuro 2014, 1, 23–26. [Google Scholar] [PubMed]
  92. Way, B.M.; Creswell, J.D.; Eisenberger, N.I.; Lieberman, M.D. Dispositional mindfulness and depressive symptomatology: Correlations with limbic and self-referential neural activity during rest. Emotion 2010, 10, 12–24. [Google Scholar] [CrossRef]
  93. Nan, H.; Ni, M.Y.; Lee, P.H.; Tam, W.W.S.; Lam, T.H.; Leung, G.M.; McDowell, I. Psychometric evaluation of the Chinese version of the Subjective Happiness Scale: Evidence from the Hong Kong FAMILY cohort. Int. J. Behav. Med. 2014, 21, 646–652. [Google Scholar] [CrossRef]
  94. Shen, Z.; Cheng, Y.; Yang, S.; Dai, N.; Ye, J.; Liu, X.; Lu, J.; Li, N.; Liu, F.; Lu, Y.; et al. Changes of grey matter volume in first-episode drug-naive adult major depressive disorder patients with different age-onset. NeuroImage Clin. 2016, 12, 492–498. [Google Scholar] [CrossRef] [PubMed]
  95. Jing, B.; Liu, C.H.; Ma, X.; Yan, H.G.; Zhuo, Z.Z.; Zhang, Y.; Wang, S.H.; Li, H.Y.; Wang, C.Y. Difference in amplitude of low-frequency fluctuation between currently depressed and remitted females with major depressive disorder. Brain Res. 2013, 1540, 74–83. [Google Scholar] [CrossRef]
  96. Wei, X.; Shen, H.; Ren, J.; Liu, W.; Yang, R.; Liu, J.; Wu, H.; Xu, X.; Lai, L.; Hu, J.; et al. Alteration of spontaneous neuronal activity in young adults with non-clinical depressive symptoms. Psychiatry Res. 2015, 233, 36–42. [Google Scholar] [CrossRef]
  97. Dumas, R.; Richieri, R.; Guedj, E.; Auquier, P.; Lancon, C.; Boyer, L. Improvement of health-related quality of life in depression after transcranial magnetic stimulation in a naturalistic trial is associated with decreased perfusion in precuneus. Health Qual. Life Outcomes 2012, 10, 87. [Google Scholar] [CrossRef]
  98. Llamas-Alonso, L.A.; Barrios, F.A.; González-Garrido, A.A.; Ramos-Loyo, J. Emotional faces interfere with saccadic inhibition and attention re-orientation: An fMRI study. Neuropsychologia 2022, 173, 108300. [Google Scholar] [CrossRef]
  99. Loi, N.; Ginatempo, F.; Manca, A.; Melis, F.; Deriu, F. Faces emotional expressions: From perceptive to motor areas in aged and young subjects. J. Neurophysiol. 2021, 126, 1642–1652. [Google Scholar] [CrossRef] [PubMed]
  100. Jung, H.Y.; Pae, C.; An, I.; Bang, M.; Choi, T.K.; Cho, S.J.; Lee, S.H. A multimodal study regarding neural correlates of the subjective well-being in healthy individuals. Sci. Rep. 2022, 12, 13688. [Google Scholar] [CrossRef] [PubMed]
  101. Luo, Y.; Kong, F.; Qi, S.; You, X.; Huang, X. Resting-state functional connectivity of the default mode network associated with happiness. Soc. Cogn. Affect. Neurosci. 2016, 11, 516–524. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Experimental protocol for the fMRI session.
Figure 2. Behavioral results. Left: % of disgust responses for participants posing a disgusted/neutral/happy expression. Right: % of happiness responses for participants posing a happy/neutral/disgusted expression. * = p < 0.005; bars = standard errors.
Figure 3. Incongruent pose vs. neutral pose one-sample t-test; p < 0.05, FWE corrected. Functional results are shown on the SPM12 template; color bars represent T-values.
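The group-level analysis behind Figure 3 is a voxel-wise one-sample t-test on the subjects' contrast images, thresholded with family-wise error (FWE) control. The sketch below is purely illustrative: the study used SPM12, which controls FWE via random field theory, whereas the toy data and the Bonferroni shortcut here are our own simplifying assumptions.

```python
import numpy as np
from scipy import stats

# Toy second-level data: one contrast image per subject, flattened to voxels.
# n_subjects = 26 matches the study's sample; the values themselves are random.
rng = np.random.default_rng(42)
n_subjects, n_voxels = 26, 1000
con_images = rng.normal(size=(n_subjects, n_voxels))

# Voxel-wise one-sample t-test of the contrast against zero.
t_vals, p_vals = stats.ttest_1samp(con_images, popmean=0.0, axis=0)

# Simplest possible FWE control: Bonferroni across all tested voxels
# (SPM12 uses random field theory instead, which is less conservative).
fwe_mask = p_vals < 0.05 / n_voxels
print(f"{fwe_mask.sum()} voxels survive FWE correction")
```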
Figure 4. Empathic Concern (EC) score correlation for the contrast “posing happiness and perceiving happiness” vs. “posing happiness and perceiving disgust”; cluster size threshold k > 30, corrected at α < 0.05. Same overlay procedure as in Figure 3.
Figure 5. Fantasy subscale (FS) score correlation with the contrast “posing disgust and perceiving happiness” vs. “posing happiness and perceiving happiness”; cluster size threshold k > 31, corrected at α < 0.05. Same overlay procedure as in Figure 3.
Figure 6. Personal Distress (PD) score correlation with the contrast “perceived happiness” vs. “perceived disgust”; cluster size threshold k > 31, corrected at α < 0.05. Same overlay procedure as in Figure 3.
Figure 7. Personal Distress (PD) score correlation with the contrast “posing neutral and perceiving happiness” vs. “posing neutral and perceiving disgust”; cluster size threshold k > 27, corrected at α < 0.05. Same overlay procedure as in Figure 3.
Table 1. Peak coordinates of functional activation related to incongruent conditions.

| Brain Areas | BA | Side | K | T | MNI x | MNI y | MNI z | Tal x | Tal y | Tal z |
|---|---|---|---|---|---|---|---|---|---|---|
| **Incongruent vs. Neutral** | | | | | | | | | | |
| Precentral gyrus, postcentral gyrus | 4, 6 | r | 53 | 12.19 | 46 | −8 | 38 | 46 | −6 | 35 |
| Cerebellum | | l | 49 | 8.76 | −18 | −60 | −22 | −18 | −59 | −16 |
| | | | | 6.18 | −38 | −56 | −34 | −38 | −56 | −34 |
| Cerebellum | | r | 28 | 8.25 | 22 | −60 | 26 | 22 | −57 | 27 |
| | | | | 7.59 | 30 | −60 | −30 | 30 | −59 | −22 |
| Postcentral gyrus, inferior parietal lobule, superior temporal gyrus | 40, 41, 42 | r | 21 | 8.14 | 62 | −20 | 14 | 61 | −19 | 14 |
| | | | | 7.31 | 62 | −28 | 18 | 61 | −26 | 18 |
| | | | | 6.08 | 54 | −32 | 22 | 53 | −30 | 22 |
| Thalamus | | r | 6 | 7.62 | 14 | −8 | 2 | 14 | −8 | 2 |
| Operculum, precentral gyrus, superior temporal gyrus, insula | 6, 13, 22, 43, 44 | l | 39 | 7.4 | −50 | −12 | 10 | −50 | −11 | 10 |
| | | | | 7.16 | −50 | −4 | 6 | −50 | −4 | 6 |
| | | | | 6.69 | −58 | 4 | 6 | −57 | 4 | 5 |
| Superior temporal gyrus, operculum, insula, precentral gyrus | 6, 13, 22 | r | 27 | 7.3 | 62 | 0 | 2 | 61 | 0 | 2 |
| | | | | 6.88 | 46 | −8 | 10 | 46 | −7 | 10 |
| | | | | 6.15 | 50 | 4 | −2 | 50 | 4 | −2 |
| Cerebellum | | r | 28 | 7.29 | 6−4626−157 | | | | | |
| | | | | 6.32 | 4662760 | | | | | |
| Precentral gyrus | 6 | l | 13 | 7.18 | −42 | −12 | 42 | −42 | −10 | 39 |
| Inferior parietal lobule | 40 | l | 2 | 6.16 | −58 | −28 | 22 | −57 | −26 | 22 |
| Cerebellum | | | 1 | 5.94 | 2 | −36 | −2 | 2 | −35 | 0 |

Areas of significant changes in fMRI signal for the contrast “incongruent pose” vs. “neutral pose”; BA = Brodmann area; l = left; r = right; K = cluster size in voxels; T = t-value at the peak voxel; p < 0.05, FWE corrected.
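The tables report each peak in both MNI and Talairach space. The article does not state which MNI-to-Talairach conversion was applied; as an illustration, the widely used Brett (mni2tal) affine approximation reproduces, for example, the first Table 1 peak. A minimal sketch, in which the function name and the integer rounding are our own choices:

```python
def mni2tal(x, y, z):
    """Approximate MNI -> Talairach conversion (Brett transform)."""
    tx = 0.9900 * x
    if z >= 0:
        ty = 0.9688 * y + 0.0460 * z
        tz = -0.0485 * y + 0.9189 * z
    else:  # a slightly different fit applies below the AC-PC plane
        ty = 0.9688 * y + 0.0420 * z
        tz = -0.0485 * y + 0.8390 * z
    return round(tx), round(ty), round(tz)

# First Table 1 peak: MNI (46, -8, 38) maps to (46, -6, 35),
# matching the Talairach coordinates reported in the table.
print(mni2tal(46, -8, 38))
```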
Table 2. Peak coordinates of positive correlation with the Empathic Concern subscale.

| Brain Areas | BA | Side | K | T | MNI x | MNI y | MNI z | Tal x | Tal y | Tal z |
|---|---|---|---|---|---|---|---|---|---|---|
| **Hh vs. Hd** | | | | | | | | | | |
| Precuneus, superior parietal lobule | 7 | l | 34 | 4.53 | −10 | −76 | 38 | −10 | −72 | 39 |
| | | | | 4.38 | −14 | −72 | 50 | −14 | −67 | 49 |
| | | | | 4.14 | −22 | −64 | 38 | −22 | −60 | 38 |

Areas of significant correlation with EC scores for the contrast “posing happiness and perceiving happiness” vs. “posing happiness and perceiving disgust”; BA = Brodmann area; l = left; r = right; cluster size threshold k > 30, corrected at α < 0.05.
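The correlation maps in Tables 2–5 are thresholded by cluster extent (e.g., k > 30 voxels at α < 0.05): supra-threshold voxels are retained only if they belong to a sufficiently large connected cluster. Below is a minimal illustration of that final masking step; the analysis itself was run in SPM12, so the toy map, the threshold values, and the face-connectivity clustering here are our assumptions.

```python
import numpy as np
from scipy import ndimage

def cluster_extent_threshold(t_map, t_thresh, k_min):
    """Keep supra-threshold voxels only in clusters of at least k_min voxels."""
    supra = t_map > t_thresh                    # voxel-level threshold
    labels, n_clusters = ndimage.label(supra)   # default: face-connected (6-neighbour) clusters
    out = np.zeros_like(t_map)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= k_min:              # discard clusters smaller than k_min
            out[cluster] = t_map[cluster]
    return out

# Toy 3D "t-map": random values, then a cluster-extent threshold of k > 30 (i.e., >= 31).
rng = np.random.default_rng(0)
t_map = rng.normal(size=(30, 30, 30))
masked = cluster_extent_threshold(t_map, t_thresh=1.5, k_min=31)
```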
Table 3. Peak coordinates of positive correlation with the Fantasy-Empathy subscale.

| Brain Areas | BA | Side | K | T | MNI x | MNI y | MNI z | Tal x | Tal y | Tal z |
|---|---|---|---|---|---|---|---|---|---|---|
| **Dh vs. Hh** | | | | | | | | | | |
| Anterior insula, inferior frontal gyrus | 45, 47 | r | 16 | 4.9 | 42 | 20 | 2 | 42 | 19 | 1 |
| | | | | 3.79 | 50 | 28 | 6 | 50 | 27 | 4 |
| Postcentral gyrus | 3 | l | 24 | 4.81 | −38 | −36 | 54 | −38 | −32 | 51 |
| Inferior temporal gyrus | 37 | l | 35 | 4.69 | −50 | −64 | −10 | −50 | −62 | −5 |
| Superior temporal gyrus | | l | 36 | 4.48 | −46 | −52 | 18 | −46 | −50 | 19 |
| | | | | 4.39 | −42 | −44 | 34 | −42 | −41 | 33 |
| | | | | 4.1 | −54 | −48 | 22 | −53 | −45 | 23 |
| Caudate nucleus | | r | 21 | 4.24 | 14 | 0 | 10 | 14 | 0 | 9 |
| | | | | 3.77 | 22 | −12 | 18 | 22 | −11 | 17 |

Areas of significant correlation with FS scores for the contrast “posing disgust and perceiving happiness” vs. “posing happiness and perceiving happiness”; BA = Brodmann area; l = left; r = right; cluster size threshold k > 31, corrected at α < 0.05.
Table 4. Peak coordinates of positive correlation with the Personal Distress subscale.

| Brain Areas | BA | Side | K | T | MNI x | MNI y | MNI z | Tal x | Tal y | Tal z |
|---|---|---|---|---|---|---|---|---|---|---|
| **h vs. d** | | | | | | | | | | |
| Cerebellum | | r | 30 | 5.76 | 30 | −40 | −30 | 30 | −40 | −23 |
| Cuneus, superior parietal lobule, angular gyrus | 7, 39 | r | 104 | 5.44 | 10 | −76 | 38 | 10 | −72 | 39 |
| | | | | 4.81 | 18 | −68 | 46 | 18 | −64 | 46 |
| | | | | 4.81 | 26 | −64 | 46 | 26 | −60 | 45 |
| Pre- and post-central gyrus | | l | 36 | 4.9 | −50 | −20 | 50 | −50 | −17 | 47 |
| | | | | 4.26 | −34 | −12 | 42 | −34 | −10 | 39 |
| Cuneus, lingual gyrus, PCC, cerebellum | 18, 19 | r | 131 | 4.86 | 6 | −72 | 14 | 6 | −69 | 16 |
| | | | | 4.09 | 18 | −64 | −6 | 18 | −62 | −2 |
| | | | | 3.95 | 18 | −52 | −18 | 18 | −51 | −13 |
| Inferior frontal gyrus, anterior insula, postcentral gyrus | 43 | l | 69 | 4.82 | −38 | 8 | 14 | −38 | 8 | 12 |
| | | | | 4.78 | −54 | −4 | 18 | −53 | −3 | 17 |
| | | | | 4.52 | −46 | −16 | 18 | −46 | −15 | 17 |
| Middle and superior occipital cortex | | l | 39 | 4.6 | −26 | −80 | 22 | −26 | −76 | 24 |
| | | | | 4.34 | −30 | −88 | 14 | −30 | −85 | 17 |
| Middle occipital gyrus, fusiform gyrus | 19 | l | 33 | 4.14 | −46 | −76 | −2 | −46 | −74 | −2 |
| | | | | 4.12 | −38 | −72 | −18 | −38 | −71 | −12 |
| | | | | 4.12 | −30 | −80 | −14 | −30 | −78 | −8 |

Areas of significant correlation with PD scores for the contrast “perceived happiness” vs. “perceived disgust”; BA = Brodmann area; l = left; r = right; PCC = posterior cingulate cortex; cluster size threshold k > 31, corrected at α < 0.05.
Table 5. Peak coordinates of positive correlation with the Personal Distress subscale.

| Brain Areas | BA | Side | K | T | MNI x | MNI y | MNI z | Tal x | Tal y | Tal z |
|---|---|---|---|---|---|---|---|---|---|---|
| **Nh vs. Nd** | | | | | | | | | | |
| Lingual gyrus, cerebellum | 18 | r | 144 | 4.89 | 22 | −72 | −6 | 22 | −70 | −2 |
| | | | | 4.29 | 30 | −68 | −30 | 30 | −67 | −22 |
| | | | | 4.12 | 22 | −76 | −18 | 22 | −74 | −11 |
| Cerebellum | | r | 52 | 4.65 | 22 | −40 | −30 | 22 | −40 | −23 |
| | | | | 4.03 | 18 | −52 | −18 | 18 | −51 | −13 |
| | | | | 3.74 | 22 | −44 | −14 | 22 | −43 | −10 |
| Lingual gyrus, cerebellum | 18 | l | 56 | 4.31 | −2 | −76 | −22 | −2 | −75 | −15 |
| | | | | 4.06 | −14 | −76 | −26 | −14 | −75 | −18 |
| | | | | 3.87 | −30 | −72 | −30 | −30 | −71 | −22 |

Areas of significant correlation with PD scores for the contrast “posing neutral and perceiving happiness” vs. “posing neutral and perceiving disgust”; BA = Brodmann area; l = left; r = right; cluster size threshold k > 27, corrected at α < 0.05.