Article

The Left Amygdala and Right Frontoparietal Cortex Support Emotional Adaptation Aftereffects

Xinqi Su, Ruilin Fu, Huiling Li, Nan Jiang, Aqian Li, Jingyu Yang and Leilei Mei *
1 Philosophy and Social Science Laboratory of Reading and Development in Children and Adolescents (Ministry of Education), South China Normal University, Guangzhou 510631, China
2 School of Psychology, South China Normal University, Guangzhou 510631, China
3 Guangdong Key Laboratory of Mental Health and Cognitive Science, South China Normal University, Guangzhou 510631, China
4 School of Psychology, Guizhou Normal University, Guiyang 550025, China
* Author to whom correspondence should be addressed.
Brain Sci. 2024, 14(3), 257; https://doi.org/10.3390/brainsci14030257
Submission received: 1 February 2024 / Revised: 29 February 2024 / Accepted: 4 March 2024 / Published: 6 March 2024
(This article belongs to the Section Social Cognitive and Affective Neuroscience)

Abstract

Adaptation aftereffects, in which prolonged prior exposure to a stimulus (adaptation) biases the subsequent judgment of ambiguous stimuli, are a ubiquitous phenomenon. Numerous studies have found behaviorally stable adaptation aftereffects across a variety of domains. However, it is unclear which brain regions are responsible for this function, particularly in the case of high-level emotional adaptation aftereffects. To address this question, the present study used functional magnetic resonance imaging (fMRI) to investigate the neural mechanism of emotional adaptation aftereffects. Consistent with previous studies, we observed typical emotional adaptation effects in behavior: for the same morphed facial images, participants perceived increased sadness after adapting to a happy facial image and increased happiness after adapting to a sad facial image. More crucially, by contrasting neural responses to ambiguous morphed facial images (i.e., facial images of intermediate morph levels) following adaptation to happy versus sad expressions, we identified a neural substrate of emotional aftereffects in the left amygdala/insula, right angular gyrus, and right inferior frontal gyrus. These results suggest that emotional adaptation aftereffects are supported not only by brain regions subserving emotional processing but also by those subserving cognitive control.

1. Introduction

The adaptation aftereffect is a phenomenon in which the perception of stimuli, especially stimuli with ambiguous features, is biased toward non-adapted features after adaptation to a recently seen feature [1,2,3]. Research on adaptation aftereffects initially concentrated on simple visual features such as color, orientation, and motion direction [4,5,6,7]. More recently, a growing number of studies have found that aftereffects also occur at high levels of perception, including judgments of gender [8,9,10], race [8,11], identity [12,13,14,15,16], and emotion [17,18,19,20,21].
Among high-level adaptation effects, emotional adaptation aftereffects have received particular attention because they are critical for species survival as well as individual quality of life [22,23,24,25]. Emotional adaptation aftereffects are typically investigated with a task in which participants are first presented with an emotional stimulus (i.e., the adapting stimulus) and then judge the emotional category of subsequent morphed stimuli. The tendency to judge the emotions of the morphed stimuli in the direction opposite to that of the adapting stimulus is called the emotional adaptation aftereffect. For instance, prolonged exposure to a happy face can make a subsequent neutral face appear sad [20]. Using this paradigm, numerous studies have shown that emotional adaptation aftereffects can be consistently elicited in different populations, regardless of age, gender, and race [17,21,22,26,27,28,29]. Moreover, emotional adaptation aftereffects are quite robust, persisting even when the duration of adaptation is very short [30,31] or when the adapting faces are presented outside awareness [32,33] or partially occluded [23,28,34]. In addition, emotional adaptation aftereffects occur across identities [22,35,36,37,38,39], levels [40,41], and modalities [20,42,43,44,45,46,47].
Despite the accumulating behavioral evidence for emotional adaptation aftereffects, understanding of their neural mechanisms remains relatively sparse. For example, using high-temporal-resolution electroencephalography (EEG), Cheal et al. [17] found that after participants adapted to an emotional stimulus (e.g., a happy or sad facial image), physically identical neutral test stimuli not only produced a behavioral perceptual bias (i.e., the emotional adaptation aftereffect) but also elicited N170 latency differences. The emotional adaptation effect has also been reported as an increased late component between 300 and 400 ms in a magnetoencephalography (MEG) study [48]. In addition, Wang et al. [20] investigated cross-modal emotional adaptation aftereffects and found robust P1, N170, N2, and late positive event-related potential (ERP) components in both hemispheres. These results indicate that emotional adaptation aftereffects arise at both early and late stages of emotion processing. Nevertheless, because of the relatively low spatial resolution of EEG and MEG, it remains unclear which brain regions are responsible for emotional adaptation aftereffects.
In contrast, functional magnetic resonance imaging (fMRI) offers superior spatial resolution, enabling brain activity to be localized with greater anatomical precision and making it possible to examine which brain regions are associated with emotional adaptation aftereffects. To our knowledge, only one study has explored the neural mechanisms of emotional adaptation effects using fMRI. Specifically, Furl and colleagues [1] examined how brain activity associated with the perception of expression and identity categories in ambiguous morphed faces was influenced by prior adaptation. For expression categorization, the adaptation aftereffect was associated with heightened right medial temporal cortex activity, specifically when subjects perceived the non-adapted emotion category [1]. Notably, the happy and fearful adaptation conditions, which might elicit opposite aftereffects [18], were combined into a single condition and contrasted with the non-adapted condition, a design that would likely cancel out the two opposite effects. Therefore, the brain regions related to emotional adaptation aftereffects remain to be specified.
To identify the brain regions underlying emotional adaptation aftereffects, we used fMRI together with the classic emotional adaptation paradigm, in which participants were first presented with an emotional facial image and then judged the facial expression of a subsequent morphed image. For the behavioral data, we fitted participants' responses with a psychometric function. We hypothesized that participants would tend to judge the subsequent morphed images as sad after adapting to a happy facial image and as happy after adapting to a sad facial image. For the imaging data, we examined the neural mechanisms of emotional adaptation aftereffects with a whole-brain activation analysis, in which neural responses to test images with ambiguous expressions were compared across the happy, neutral, and sad adaptation conditions (i.e., a stimulus-based analysis). These results were further confirmed with a perception-based region-of-interest (ROI) analysis, in which the happy, neutral, and sad conditions were defined based on participants' responses. Given previous ERP findings that emotional adaptation affects both early and late components [17,20,48], we expected emotional adaptation effects to be supported by brain regions subserving high-level cognitive control (e.g., frontoparietal regions) in addition to those subserving emotional processing (e.g., the amygdala).

2. Materials and Methods

2.1. Participants

Twenty-two native Chinese participants (mean age = 20.73 ± 2.25 years; six males) were recruited for this study. To ensure a sample size sufficient to detect the expected effects, we conducted a power analysis with G*Power 3.1 [49], which indicated that 17 participants would be sufficient to detect a medium effect size (f = 0.25) with 0.80 power in a one-way repeated-measures analysis of variance [50,51]. All participants were right-handed (Edinburgh Handedness Inventory: mean = 73.45, SD = 18.49) [52], had normal or corrected-to-normal vision, and reported no history of neurological or psychiatric disorders. Written informed consent was obtained from all participants before the experiment. All experimental procedures were approved by the Institutional Review Board of the School of Psychology at South China Normal University (SCNU-PSY-319, approved on 19 November 2018) and conducted in accordance with its relevant regulations. One participant was excluded from the subsequent analysis because of an excessively high rate of nonresponse trials during scanning (25.93%, more than three standard deviations above the mean).
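For illustration, the sketch below performs a comparable calculation in Python. Note that statsmodels' FTestAnovaPower models a between-subjects one-way ANOVA and ignores the repeated-measures correlation that G*Power incorporates, so it is only a rough analogue of the G*Power calculation reported above.

```python
# Rough analogue of the reported power analysis; numbers will differ from
# G*Power's repeated-measures calculation, which folds in the correlation
# among repeated measurements.
from statsmodels.stats.power import FTestAnovaPower

analysis = FTestAnovaPower()
n_required = analysis.solve_power(
    effect_size=0.25,  # Cohen's f, "medium" by convention [50,51]
    alpha=0.05,
    power=0.80,
    k_groups=3,        # three adaptor conditions: happy, neutral, sad
)
print(f"Approximate total N (between-subjects analogue): {n_required:.1f}")
```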

2.2. Materials

Three facial images posed by one male model and showing happy (happy proportion = 1), neutral (happy proportion = 0.5), and sad (happy proportion = 0) expressions were selected from the Karolinska Directed Emotional Faces (KDEF) database [53] as adapting images. All test images used in the experiment were generated with WebMorph (STOIKimage, https://webmorph.org, accessed on 5 September 2018). In line with prior research [20,21,22], we used a morphing technique to blend the sad facial image with the neutral facial image, producing a series of images with proportions of happiness ranging from 0 to 0.5. Similarly, we morphed the neutral image with the happy image to create another series ranging from 0.5 to 1. Eight images with happy proportions of 0.15, 0.3, 0.4, 0.5, 0.6, 0.7, 0.85, and 1.0 were chosen as test images to generate robust and informative psychometric curves (Figure 1A). All adapting and test images were converted to grayscale and cropped into an oval shape to remove external features. In addition to the eight test images, two grating images tilted 45° to the left or right of the vertical axis were included for other purposes.
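As a rough illustration of the morphing logic (not the WebMorph implementation, which also warps facial geometry between landmark-aligned images), a pixel-wise cross-dissolve can be sketched as follows; file names are hypothetical.

```python
# Minimal sketch of morphing as a linear cross-dissolve between two grayscale
# face images, assuming they are already landmark-aligned.
import numpy as np
from PIL import Image

neutral = np.asarray(Image.open("neutral.png").convert("L"), dtype=float)
happy = np.asarray(Image.open("happy.png").convert("L"), dtype=float)

def morph(img_a, img_b, weight_b):
    """Pixel-wise blend: weight_b = 0 returns img_a, 1 returns img_b."""
    return (1.0 - weight_b) * img_a + weight_b * img_b

# A test image with happy proportion 0.7 lies between neutral (0.5) and happy
# (1.0); rescaling 0.7 onto the [0.5, 1.0] segment gives a blend weight of 0.4.
test_07 = morph(neutral, happy, (0.7 - 0.5) / 0.5)
Image.fromarray(test_07.astype(np.uint8)).save("test_happy_0.70.png")
```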
We recruited another 16 Chinese participants to evaluate the emotion category (sad, neutral, or happy) and degree of arousal (1 = "very low", 7 = "very high") of each image. The results showed that the three adapting images were perceived as happy, neutral, and sad, as intended, and that the degrees of arousal associated with the eight test images coincided with participants' subjective judgments (see Figure S1). Specifically, the arousal ratings of the emotional facial images were higher than those of the neutral images, such that the arousal ratings across the image series followed a U-shaped curve.

2.3. Procedure

2.3.1. Pre-Scanning

Before the formal fMRI scan, participants completed a brief practice session to familiarize themselves with the experimental procedure. The images used in the practice session were not presented during the fMRI scan. After the practice, all participants rested for 10 min to ensure a calm state before scanning. All behavioral data were collected using Psychtoolbox-3 (http://www.psychtoolbox.org/, accessed on 8 June 2015) in MATLAB R2013a (https://www.mathworks.com, accessed on 8 July 2018).

2.3.2. Scanning

The fMRI scan consisted of three adaptor conditions (neutral, sad, and happy), each comprising two functional runs. The first run of each condition included an extra 30 s of preadaptation to enhance the adaptation effect and prevent interference from the preceding adaptor condition. The order of the three conditions was counterbalanced across participants. Each run consisted of 63 trials: 7 repetitions of each of the 8 test images and of the grating image. Trials were presented in a pseudorandom order, and OPTSEQ2 (http://surfer.nmr.mgh.harvard.edu/optseq/, accessed on 8 June 2021) was used to optimize the trial sequences.
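To make the trial structure concrete, the sketch below builds one run's 63-trial sequence with a plain shuffle; the actual experiment used OPTSEQ2-optimized sequences rather than a simple randomization.

```python
# One run: 63 trials = 7 repetitions of 9 stimuli (8 morphed test images plus
# the grating). A plain shuffle stands in for the OPTSEQ2-optimized ordering.
import random

stimuli = [0.15, 0.3, 0.4, 0.5, 0.6, 0.7, 0.85, 1.0, "grating"]
trials = stimuli * 7          # 7 repetitions each -> 63 trials
random.shuffle(trials)        # pseudorandom order
print(len(trials), trials[:9])
```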

2.3.3. Trial Procedure

Each trial began with the presentation of an adapting image for 4 s, followed by a 0.5 s fixation interval (interstimulus interval, ISI). A test image was then presented for 0.2 s (Figure 1B). Participants were instructed to judge, as quickly and accurately as possible, whether the test image was happy or sad, or whether the grating image was tilted to the left or right, by pressing one of two keys (the "1" key for happy or left and the "4" key for sad or right). Key assignment was counterbalanced across participants. Participants were instructed to fixate on a white cross at the center of the black screen to eliminate the effects of fixation differences [30,54]. In each trial, responses were collected during a jittered fixation interval lasting between 2.3 and 5.3 s (mean = 4.3 s), which also enhanced the design efficiency [55]. Additionally, based on previous findings of stronger adaptation aftereffects for stimuli presented in the visual periphery than at the fovea [20,28], all images were presented to the left of the central fixation cross throughout the experiment.

2.4. Acquisition of Imaging Data

The MRI data were acquired on a 3.0 T Siemens MRI scanner at the MRI Center of South China Normal University. Functional images were obtained with a single-shot T2*-weighted gradient-echo echo-planar imaging (EPI) sequence (58 axial slices, repetition time (TR)/echo time (TE)/θ = 2000 ms/30 ms/90°, field of view (FOV) = 224 × 224 mm, matrix size = 112 × 112, slice thickness = 2.0 mm), yielding a voxel size of 2.0 × 2.0 × 2.0 mm. Anatomical images were acquired with a T1-weighted three-dimensional gradient-echo pulse sequence (176 sagittal slices, TR/TE/θ = 2530 ms/1.94 ms/7°, FOV = 256 × 256 mm, matrix size = 256 × 256, slice thickness = 1.0 mm). Anatomical magnetization-prepared rapid gradient-echo (MPRAGE) images were collected at 0.5 × 0.5 × 1.0 mm resolution.

2.5. Analysis of Behavioral Data

To measure the emotional adaptation aftereffects, we first computed the fraction of happy responses to each test image within each adaptation condition for every participant. These fractions were then averaged across participants to obtain the mean fraction of happy responses for each test image in each of the three adaptation conditions. Next, the mean fractions of happy responses were plotted against the proportion of happiness in the morphed test images and fitted, for each condition, with a sigmoidal function of the form f(x) = 1/[1 + e^(a(x − b))], where b represents chance performance [the 50% point of the psychometric function, i.e., the point of subjective equality (PSE)] and a/4 determines the slope of the function at the PSE. The amplitude of the adaptation aftereffect was computed as the PSE of each emotional adaptation condition (happy or sad adaptor) minus that of the baseline (neutral adaptor) condition. Positive and negative values indicate rightward and leftward shifts of the psychometric curve, respectively; in other words, fewer or more happy judgments relative to baseline. Finally, the significance of the adaptation aftereffects was assessed with one-sample t tests.
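As an illustration of this fitting procedure, the following Python sketch fits the logistic function above with SciPy and computes the PSE shift; the response fractions shown are hypothetical.

```python
# Fit the psychometric function f(x) = 1 / (1 + exp(a * (x - b))) and compute
# the PSE shift of an adaptation condition relative to the neutral baseline.
import numpy as np
from scipy.optimize import curve_fit

morph_levels = np.array([0.15, 0.3, 0.4, 0.5, 0.6, 0.7, 0.85, 1.0])

def sigmoid(x, a, b):
    # b is the PSE (50% point); the fitted a is negative for an increasing curve.
    return 1.0 / (1.0 + np.exp(a * (x - b)))

# Hypothetical mean fractions of happy responses, for illustration only.
fractions = {
    "neutral": np.array([0.02, 0.05, 0.20, 0.50, 0.80, 0.95, 0.99, 1.00]),
    "happy":   np.array([0.01, 0.03, 0.10, 0.25, 0.55, 0.85, 0.98, 1.00]),
}

pse = {}
for cond, y in fractions.items():
    (a, b), _ = curve_fit(sigmoid, morph_levels, y, p0=[-10.0, 0.5])
    pse[cond] = b

# Positive shift = rightward curve shift = fewer happy judgments than baseline.
print(f"PSE shift (happy adaptor): {pse['happy'] - pse['neutral']:+.3f}")
```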

2.6. Image Preprocessing and Activation Analysis

Image preprocessing was performed using FEAT (FMRI Expert Analysis Tool) Version 6.00 in FSL (FMRIB's Software Library, http://www.fmrib.ox.ac.uk/fsl, accessed on 3 March 2020). To achieve T1 signal equilibrium, the first four volumes of each time series were discarded. The remaining images were then stripped of non-brain tissue using the brain extraction tool [56] and realigned using MCFLIRT [57]. No participant had translational movement exceeding 1 voxel in any direction in any run. A 5 mm full-width-at-half-maximum (FWHM) Gaussian kernel and a nonlinear high-pass filter with a 100 s cutoff were used for spatial smoothing and temporal filtering of the functional data, respectively. A two-step registration process was employed to register the functional images to standard Montreal Neurological Institute (MNI) space: first from the functional images to the MPRAGE structural images and then to the MNI template [58]. The second step was further refined with FNIRT non-linear registration [59,60].
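As an illustration of two of these steps (5 mm FWHM smoothing and 100 s high-pass filtering), the sketch below uses nilearn rather than FEAT; the file names are hypothetical, and brain extraction, motion correction, and registration are omitted.

```python
# Illustrative analogue of the smoothing and temporal-filtering steps, using
# nilearn instead of FSL's FEAT.
from nilearn import image

func = image.load_img("sub-01_task-adapt_run-01_bold.nii.gz")
func = image.smooth_img(func, fwhm=5)   # 5 mm FWHM Gaussian kernel
func = image.clean_img(
    func,
    detrend=False,
    standardize=False,
    high_pass=1.0 / 100,   # 100 s cutoff expressed in Hz
    t_r=2.0,               # TR from the acquisition protocol
)
func.to_filename("sub-01_task-adapt_run-01_bold_preproc.nii.gz")
```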
The analysis was conducted at three levels. At the first level, a general linear model (GLM) was applied to the preprocessed data for each participant and each run. The regressors of the GLM were generated by convolving the event onsets and durations with a double-gamma hemodynamic response function. Following previous research [1,20], we divided the eight test images into three conditions: happy (proportion of happiness: 0.7, 0.85, and 1.0), neutral (proportion of happiness: 0.4, 0.5, and 0.6), and sad (proportion of happiness: 0.15 and 0.3). The preadaptation images, adapting images, and grating images were modeled as nuisance variables to avoid potential confounding effects. Fixation was not explicitly modeled and consequently served as an implicit baseline. To improve statistical sensitivity, the six motion parameters and their temporal derivatives were included as covariates of no interest. Following previous studies [1,17], the subsequent analyses focused on the effects of emotional adaptation on the perception of images with ambiguous expressions (i.e., the neutral condition); accordingly, the contrast for the neutral condition was computed for each run and each participant.
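For illustration, the sketch below builds a comparable first-level design matrix with nilearn; the onsets, durations, and scan count are hypothetical, and nilearn's 'spm' model corresponds to a canonical double-gamma HRF.

```python
# Sketch of a first-level design matrix in the spirit of the GLM above: test
# conditions of interest plus nuisance regressors for adaptor and grating.
import numpy as np
import pandas as pd
from nilearn.glm.first_level import make_first_level_design_matrix

t_r = 2.0
n_scans = 300
frame_times = np.arange(n_scans) * t_r

events = pd.DataFrame({
    "onset":      [10.0, 24.0, 38.0, 52.0, 66.0],   # hypothetical onsets (s)
    "duration":   [0.2, 0.2, 0.2, 4.0, 0.2],
    "trial_type": ["happy", "neutral", "sad", "adaptor", "grating"],
})

design = make_first_level_design_matrix(
    frame_times,
    events,
    hrf_model="spm",     # canonical double-gamma HRF
    drift_model=None,    # high-pass filtering already applied upstream
)
print(design.columns.tolist())
```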
A second-level analysis was then conducted for each participant by concatenating the imaging data from all six runs in a fixed-effects model. At the third level, group activations were obtained using simple OLS (ordinary least squares), one of the higher-level mixed-effects options in FSL. All reported group images were thresholded with a height threshold of Z > 2.6 (i.e., p < 0.005) and a cluster probability of p < 0.05, corrected for whole-brain multiple comparisons using Gaussian random field theory [61].
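A comparable voxel-height threshold can be sketched with nilearn as follows; note that the fixed cluster extent is only a placeholder for the extent that Gaussian random field theory would derive, and the image name is hypothetical.

```python
# Sketch of cluster-extent thresholding of a group Z map at the voxel height
# used in the study (p < 0.005 one-sided, i.e., Z > 2.6).
from nilearn import image
from nilearn.glm import threshold_stats_img

z_map = image.load_img("group_zstat_happy_vs_sad.nii.gz")
thresholded, threshold = threshold_stats_img(
    z_map,
    alpha=0.005,            # voxel-level p < 0.005
    height_control="fpr",
    cluster_threshold=50,   # placeholder extent; GRF would compute this
    two_sided=False,
)
print(f"Voxel threshold applied: Z = {threshold:.2f}")
```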

2.7. ROI Analysis

An ROI analysis was further performed to confirm the results of the above stimulus-based analysis [i.e., the analysis in which the same set of morphed facial images (proportion of happiness: 0.4, 0.5, and 0.6) was used in the three adaptation conditions]. In the ROI analysis, a perception-based approach was used because there is evidence that neural activation during the adaptation paradigm depends on the perceptual bias of participants [17,62,63] and that adaptation aftereffects are more pronounced in perception-based than in stimulus-based analyses [13]. Following previous studies [13,17], three perception-based conditions were constructed: neutral test trials following a neutral adapting image (hereinafter, neutral tests following a neutral adaptor), neutral trials with sad perception following a happy adapting image (hereinafter, sad perception following a happy adaptor), and neutral trials with happy perception following a sad adapting image (hereinafter, happy perception following a sad adaptor).
A total of four ROIs were defined in the current study. First, to further confirm the results of the stimulus-based analysis, three ROIs (the left amygdala/insula, right angular gyrus, and right inferior frontal gyrus) were functionally defined based on the activation clusters found in the whole-brain activation analysis. In addition, to explore the effect of face perception on emotional adaptation aftereffects, the right fusiform face area (FFA), a key brain region for face perception, was defined as a sphere of 6 mm radius around the coordinates (MNI: 40, −55, −10) reported by Kanwisher et al. [64].
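For illustration, the spherical FFA ROI can be constructed with nilearn's NiftiSpheresMasker as sketched below; the functional image name is hypothetical.

```python
# 6 mm-radius sphere around the right-FFA coordinate (40, -55, -10) in MNI
# space, from Kanwisher et al. [64], used to extract an averaged time course.
from nilearn.maskers import NiftiSpheresMasker

ffa_masker = NiftiSpheresMasker(
    seeds=[(40, -55, -10)],  # right FFA, MNI space
    radius=6.0,              # mm
)
# Returns one averaged time course per sphere (n_scans x 1 array).
ffa_timeseries = ffa_masker.fit_transform("sub-01_bold_mni.nii.gz")
```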
In each of the four ROIs, percent signal changes were then calculated using the formula [contrast image/(mean of run)] × ppheight × 100%, where the contrast image was extracted from each perception-based condition of the fitted GLM, the mean of the run was the mean of the functional data entering the GLM, and "ppheight" was the peak height of the hemodynamic response relative to the baseline level of activity [65]. Finally, a one-way repeated-measures analysis of variance (ANOVA) was performed separately for each ROI to test for differences among conditions. When the main effect was significant, Bonferroni-corrected post hoc tests were performed.
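As an illustration of the final step, the sketch below runs the one-way repeated-measures ANOVA on hypothetical percent-signal-change values using statsmodels.

```python
# Per-ROI repeated-measures ANOVA on percent signal change (PSC), in long
# format: one PSC value per participant per perception-based condition.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
psc = pd.DataFrame({
    "subject":   np.repeat(np.arange(1, 22), 3),                 # 21 subjects
    "condition": np.tile(["happy_perc", "neutral", "sad_perc"], 21),
    "psc":       rng.normal(0.2, 0.1, size=63),                  # placeholder
})

result = AnovaRM(psc, depvar="psc", subject="subject",
                 within=["condition"]).fit()
print(result)  # F test of the main effect of condition in one ROI
```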

3. Results

3.1. Behavioral Results

The behavioral responses in the three adaptation conditions are illustrated in Figure 2A. The sigmoidal function fitted the participant responses well, with R2 values (mean ± SD) of 0.98 ± 0.03 in the happy, 0.99 ± 0.01 in the neutral, and 0.97 ± 0.04 in the sad adaptor conditions. After adapting to the happy facial image, participants perceived happy expressions less frequently, and the psychometric curve shifted to the right compared to the neutral adaptor. After adapting to the sad facial image, participants perceived happy expressions more frequently, and the psychometric curve shifted to the left compared to the neutral adaptor.
To quantify the aftereffects, we calculated the PSE shift relative to the baseline (neutral adaptor) condition for the happy and sad psychometric curves of all participants. As shown in Figure 2B, both the happy adaptor (mean PSE shift = 0.13, t(20) = 6.77, p < 0.001, Cohen's d = 3.03) and the sad adaptor (mean PSE shift = −0.13, t(20) = −6.15, p < 0.001, Cohen's d = −2.75) produced significant adaptation aftereffects. In other words, for test stimuli with identical physical properties, participants responded in opposite directions under the happy and sad adaptation conditions relative to baseline, making more sad and more happy judgments, respectively.
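For illustration, the significance test and effect size for a PSE shift can be computed as sketched below, with placeholder per-participant shifts.

```python
# One-sample t test of the per-participant PSE shifts against zero, plus
# Cohen's d computed as mean / SD of the shifts.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(1)
pse_shifts = rng.normal(0.13, 0.05, size=21)  # placeholder per-subject shifts

t_stat, p_value = ttest_1samp(pse_shifts, popmean=0.0)
cohens_d = pse_shifts.mean() / pse_shifts.std(ddof=1)
print(f"t(20) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```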

3.2. fMRI Results

Whole-brain activation analysis was conducted to explore the neural activations underlying emotional adaptation aftereffects. Following previous studies, we focused on the images with intermediate morph levels (i.e., the neutral images) because the emotional adaptation aftereffect was greatest for test stimuli at intermediate morph levels (see Figure 2). The analysis revealed that, compared with fixation, all three adaptation conditions elicited activation in an extensive neural network, including the bilateral insular cortex, superior frontal gyrus, middle frontal gyrus, inferior frontal gyrus, precentral gyrus, postcentral gyrus, superior parietal lobule, supramarginal gyrus, lateral occipital cortex, supplementary motor cortex, cingulate gyrus, precuneus, lingual gyrus, and fusiform cortex (Figure 3). Further comparisons between adaptation conditions revealed that the happy adaptation condition showed stronger activation than the sad adaptation condition in the left amygdala/insula, right angular gyrus, and right pars opercularis of the inferior frontal gyrus (Figure 3). No significant activation differences were found for the other contrasts.
Perception-based ROI analysis was also conducted to confirm the activation results of the stimulus-based analysis described above. As described in the Methods, the three perception-based conditions (i.e., sad perception following a happy adaptor, neutral tests following a neutral adaptor, and happy perception following a sad adaptor) were constructed based on participants' subjective perception. The results showed significant emotional adaptation aftereffects in the three brain regions identified in the whole-brain analysis, but no such effects in the brain region responsible for face processing (i.e., the right FFA; F(2, 60) = 0.87, p = 0.424, ηp² = 0.028) (Figure 4). Specifically, the main effect of emotional adaptation was significant in the left amygdala/insula (F(2, 60) = 12.63, p < 0.001, ηp² = 0.296; qFDR < 0.001), right angular gyrus (F(2, 60) = 3.72, p = 0.03, ηp² = 0.110; qFDR = 0.04), and right pars opercularis of the inferior frontal gyrus (F(2, 60) = 5.65, p = 0.006, ηp² = 0.158; qFDR = 0.012). Post hoc comparisons revealed significantly higher activation in the perception-based happy condition than in the perception-based sad condition in all three ROIs (left amygdala/insula: t(20) = 5.18, p < 0.001, Cohen's d = 1.64; right AG: t(20) = 2.76, p = 0.033, Cohen's d = 0.87; right IFGpo: t(20) = 3.19, p = 0.004, Cohen's d = 1.01), as well as higher activation than in the perception-based neutral condition in the amygdala/insula (t(20) = 2.65, p = 0.014, Cohen's d = 0.84; Figure 4).

4. Discussion

Using fMRI and the classic emotional adaptation paradigm, this study examined the neural mechanism of emotional adaptation aftereffects. Consistent with previous studies [18,21,22,23,28,66,67], the behavioral results showed a typical emotional adaptation aftereffect: participants tended toward sad judgments of the subsequent morphed images after adapting to a happy facial image and toward happy judgments after adapting to a sad facial image. More importantly, the imaging results showed that emotional adaptation aftereffects were supported by brain regions subserving emotion processing (the left amygdala/insula) and cognitive control (the right inferior frontal gyrus pars opercularis and right angular gyrus), but not by brain regions subserving face perception (e.g., the right fusiform face area). These findings imply that prior adaptation, which behaviorally biases emotional judgments toward the non-adapted category, engages high-level emotional processing and cognitive control.
Our study provides precise spatial locations of brain regions associated with emotional adaptation aftereffects. As discussed in the Introduction, much behavioral research has shown the existence of emotional adaptation aftereffects [18,21,22,23,28,66,67], and numerous electrophysiological studies have examined the time course of these aftereffects [17,20,48]. Here, we specified which brain regions support emotional adaptation aftereffects. Specifically, by comparing neural responses to ambiguous morphed facial images (i.e., facial images of intermediate morph levels) following different adapting emotions, the present study revealed that the same facial images produced greater activation in the left amygdala/insula, right angular gyrus, and right inferior frontal gyrus after adapting to a happy facial image than after adapting to a sad facial image. These results were further confirmed by taking participants’ behavioral responses into account (i.e., perception-based ROI analysis). In contrast, emotional adaptation aftereffects were not found in the key brain region for face perception (i.e., the right fusiform face area). These results indicate that the emotional adaptation aftereffects are supported by brain regions subserving emotional processing and cognitive control but not by those subserving face perception.
The amygdala, especially the left amygdala, has long been thought to play an important role in emotional processing [68,69,70]. Furthermore, a meta-analysis found that the left amygdala was the brain structure consistently recruited by emotional decision-making, regardless of task instructions [71]. As a cortical center for visceral information processing and interoception, the insula is thought to be crucial in both emotional experience and subjective perception [72,73,74]. Furthermore, the insula, as part of the salience network, is important for the rapid detection of personally relevant or otherwise significant emotional cues in the environment [75,76,77]. Our findings showing the involvement of brain regions subserving emotional processing are consistent with previous findings that emotional adaptation aftereffects were associated with high-level emotional processing [22,30,31,54,78] and that the magnitude of emotional adaptation aftereffects was positively correlated with the emotional intensity of the adapting faces [19,27,30,79].
In addition, the inferior frontal gyrus is considered to play a prominent role in the processing of facial expressions [80,81]. The gray matter volume of the right inferior frontal gyrus is closely related to accurate recognition of facial expressions [82], and lesions of this structure impair that function [83]. Using meta-analytic connectivity modeling, a recent meta-analysis of 96 fMRI and positron emission tomography (PET) studies identified a functionally co-activated neural network that includes the amygdala and the inferior frontal gyrus [84]. Moreover, the right angular gyrus is part of the lateral parietal cortex and serves as a multimodal integration region [85,86,87,88], participating in different cognitive tasks through connections to different core cognitive networks [89]. More importantly, the angular gyrus and the inferior frontal gyrus together form part of the frontoparietal control network [90], which subserves cognitive control and is critical for coordinating behavior in a rapid, accurate, and flexible goal-driven manner [91]. The frontoparietal control network flexibly couples with and regulates other functional brain networks according to the goal of the current task [92,93]. Our finding that regions of the frontoparietal control network are involved suggests a critical role of cognitive control in emotional adaptation aftereffects.
Our results have important implications for theoretical explanations of emotional adaptation aftereffects. Specifically, they argue against the view that the angle or orientation of the mouth is sufficient to explain the perceived emotional changes in faces [40,41,94]. If that view were correct, differential activation across adaptation conditions should have appeared in low-level face perception regions (such as the right FFA). However, neither the stimulus-based whole-brain activation analysis nor the perception-based ROI analysis supported this expectation. Instead, our findings are better explained by the model of the cognitive control of emotion, which holds that regulating emotional responses is essentially a process of cognitive control and that different emotion regulation strategies (i.e., cognitive control processes) affect some or all stages of emotion generation [95]. In line with this model, both the stimulus-based whole-brain activation analysis and the perception-based ROI analysis showed the involvement of emotional processing regions (the left amygdala/insula) and components of the frontoparietal control network (the right angular gyrus and right inferior frontal gyrus) in emotional adaptation aftereffects.
Two limitations of this study should be noted. First, because of the higher proportion of female participants, we cannot completely rule out effects of gender on the neural mechanisms of emotional adaptation. Future studies should therefore replicate our findings with a more balanced male-to-female ratio. Second, although the behavioral results showed significant differences in PSE between the emotional adaptation conditions (i.e., happy and sad) and the neutral adaptation condition, no corresponding differences in neural activation were observed in any brain region. This discrepancy may be attributable to the relatively small number of trials (i.e., facial images of intermediate morph levels), which might not provide a sufficient signal-to-noise ratio in the BOLD responses. Future studies should attempt to replicate our results with a larger number of trials [1,96].
In summary, using a classical emotional adaptation paradigm and fMRI technology, we found that prior adaptation experiences biased emotion judgment toward the non-adapted category. More importantly, emotional adaptation aftereffects were supported by brain regions subserving emotional processing and cognitive control but not by those subserving low-level face perception. These results suggest that emotional adaptation aftereffects are a high-level phenomenon.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/brainsci14030257/s1, Figure S1: Emotion classification and arousal evaluation.

Author Contributions

Conceptualization, L.M.; Data curation, X.S., R.F. and N.J.; Formal analysis, X.S. and H.L.; Funding acquisition, L.M.; Investigation, R.F.; Methodology, H.L. and L.M.; Software, N.J.; Supervision, L.M.; Visualization, X.S. and H.L.; Writing—original draft, X.S., R.F. and L.M.; Writing—review and editing, X.S., A.L., J.Y. and L.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China [32271098], and the Basic and Applied Basic Research Foundation of Guangdong Province [2022A1515011082].

Institutional Review Board Statement

All experimental procedures were approved by the Institutional Review Board of the School of Psychology at South China Normal University (SCNU-PSY-319, date: 19 November 2018) and conducted in keeping with the relevant regulations of the Institutional Review Board.

Informed Consent Statement

Written informed consent was obtained from all participants.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request. The data are not publicly available due to privacy and ethical restrictions.

Conflicts of Interest

All authors declare no conflicts of interest.

References

  1. Furl, N.; Van Rijsbergen, N.J.; Treves, A.; Dolan, R.J. Face Adaptation Aftereffects Reveal Anterior Medial Temporal Cortex Role in High Level Category Representation. NeuroImage 2007, 37, 300–310. [Google Scholar] [CrossRef]
  2. Webster, M.A.; Kaping, D.; Mizokami, Y.; Duhamel, P. Adaptation to Natural Facial Categories. Nature 2004, 428, 557–561. [Google Scholar] [CrossRef]
  3. Witthoft, N.; Sha, L.; Winawer, J.; Kiani, R. Sensory and Decision-Making Processes Underlying Perceptual Adaptation. J. Vis. 2018, 18, 10. [Google Scholar] [CrossRef]
  4. Clifford, C.W.G. Perceptual Adaptation: Motion Parallels Orientation. Trends Cogn. Sci. 2002, 6, 136–143. [Google Scholar] [CrossRef] [PubMed]
  5. Gibson, J.J.; Radner, M. Adaptation, after-Effect and Contrast in the Perception of Tilted Lines. I. Quantitative Studies. J. Exp. Psychol. 1937, 20, 453–467. [Google Scholar] [CrossRef]
  6. Huk, A.C.; Ress, D.; Heeger, D.J. Neuronal Basis of the Motion Aftereffect Reconsidered. Neuron 2001, 32, 161–172. [Google Scholar] [CrossRef]
  7. McCollough, C. Color Adaptation of Edge-Detectors in the Human Visual System. Science 1965, 149, 1115–1116. [Google Scholar] [CrossRef]
  8. Davidenko, N.; Vu, C.Q.; Heller, N.H.; Collins, J.M. Attending to Race (or Gender) Does Not Increase Race (or Gender) Aftereffects. Front. Psychol. 2016, 7, 909. [Google Scholar] [CrossRef]
  9. Pond, S.; Kloth, N.; McKone, E.; Jeffery, L.; Irons, J.; Rhodes, G. Aftereffects Support Opponent Coding of Face Gender. J. Vis. 2013, 13, 16. [Google Scholar] [CrossRef] [PubMed]
  10. Zhao, C.; Seriès, P.; Hancock, P.J.B.; Bednar, J.A. Similar Neural Adaptation Mechanisms Underlying Face Gender and Tilt Aftereffects. Vis. Res. 2011, 51, 2021–2030. [Google Scholar] [CrossRef]
  11. Jaquet, E.; Rhodes, G.; Hayward, W.G. Race-Contingent Aftereffects Suggest Distinct Perceptual Norms for Different Race Faces. Vis. Cogn. 2008, 16, 734–753. [Google Scholar] [CrossRef]
  12. Armann, R.; Jeffery, L.; Calder, A.J.; Rhodes, G. Race-Specific Norms for Coding Face Identity and a Functional Role for Norms. J. Vis. 2011, 11, 9. [Google Scholar] [CrossRef]
  13. Fox, C.; Moon, S.; Iaria, G.; Barton, J. The Correlates of Subjective Perception of Identity and Expression in the Face Network: An fMRI Adaptation Study. NeuroImage 2009, 44, 569–580. [Google Scholar] [CrossRef] [PubMed]
  14. Jeffery, L.; Rhodes, G. Insights into the Development of Face Recognition Mechanisms Revealed by Face Aftereffects: Insights into the Development of Face Recognition. Br. J. Psychol. 2011, 102, 799–815. [Google Scholar] [CrossRef] [PubMed]
  15. Ross, D.A.; Deroche, M.; Palmeri, T.J. Not Just the Norm: Exemplar-Based Models Also Predict Face Aftereffects. Psychon. Bull. Rev. 2014, 21, 47–70. [Google Scholar] [CrossRef] [PubMed]
  16. Walther, C.; Schweinberger, S.R.; Kovács, G. Adaptor Identity Modulates Adaptation Effects in Familiar Face Identification and Their Neural Correlates. PLoS ONE 2013, 8, e70525. [Google Scholar] [CrossRef]
  17. Cheal, J.L.; Heisz, J.J.; Walsh, J.A.; Shedden, J.M.; Rutherford, M.D. Afterimage Induced Neural Activity during Emotional Face Perception. Brain Res. 2014, 1549, 11–21. [Google Scholar] [CrossRef]
  18. Rutherford, M.D.; Chattha, H.M.; Krysko, K.M. The Use of Aftereffects in the Study of Relationships among Emotion Categories. J. Exp. Psychol. Hum. Percept. Perform. 2008, 34, 27–40. [Google Scholar] [CrossRef] [PubMed]
  19. Skinner, A.L.; Benton, C.P. Anti-Expression Aftereffects Reveal Prototype-Referenced Coding of Facial Expressions. Psychol. Sci. 2010, 21, 1248–1253. [Google Scholar] [CrossRef]
  20. Wang, X.; Guo, X.; Chen, L.; Liu, Y.; Goldberg, M.E.; Xu, H. Auditory to Visual Cross-Modal Adaptation for Emotion: Psychophysical and Neural Correlates. Cereb. Cortex 2016, 27, bhv321. [Google Scholar] [CrossRef]
  21. Ying, H.; Xu, H. Adaptation Reveals That Facial Expression Averaging Occurs during Rapid Serial Presentation. J. Vis. 2017, 17, 15. [Google Scholar] [CrossRef] [PubMed]
  22. Jiang, N.; Li, H.; Chen, C.; Fu, R.; Zhang, Y.; Mei, L. The Emotional Adaptation Aftereffect Discriminates between Individuals with High and Low Levels of Depressive Symptoms. Cogn. Emot. 2022, 36, 240–253. [Google Scholar] [CrossRef] [PubMed]
  23. Luo, C.; Burns, E.; Xu, H. Association between Autistic Traits and Emotion Adaptation to Partially Occluded Faces. Vis. Res. 2017, 133, 21–36. [Google Scholar] [CrossRef] [PubMed]
  24. Rhodes, G.; Burton, N.; Jeffery, L.; Read, A.; Taylor, L.; Ewing, L. Facial Expression Coding in Children and Adolescents with Autism: Reduced Adaptability but Intact Norm-Based Coding. Br. J. Psychol. 2018, 109, 204–218. [Google Scholar] [CrossRef] [PubMed]
  25. Rutherford, M.D.; Troubridge, E.K.; Walsh, J. Visual Afterimages of Emotional Faces in High Functioning Autism. J. Autism Dev. Disord. 2012, 42, 221–229. [Google Scholar] [CrossRef]
  26. Burton, N.; Jeffery, L.; Skinner, A.L.; Benton, C.P.; Rhodes, G. Nine-Year-Old Children Use Norm-Based Coding to Visually Represent Facial Expression. J. Exp. Psychol. Hum. Percept. Perform. 2013, 39, 1261–1269. [Google Scholar] [CrossRef]
  27. Hong, S.W.; Yoon, K.L. Intensity Dependence in High-Level Facial Expression Adaptation Aftereffect. Psychon. Bull. Rev. 2018, 25, 1035–1042. [Google Scholar] [CrossRef]
  28. Luo, C.; Wang, Q.; Schyns, P.G.; Kingdom, F.A.A.; Xu, H. Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces. PLoS ONE 2015, 10, e0145877. [Google Scholar] [CrossRef] [PubMed]
  29. Vida, M.D.; Mondloch, C.J. Children’s Representations of Facial Expression and Identity: Identity-Contingent Expression Aftereffects. J. Exp. Child Psychol. 2009, 104, 326–345. [Google Scholar] [CrossRef]
  30. Sou, K.L.; Xu, H. Brief Facial Emotion Aftereffect Occurs Earlier for Angry than Happy Adaptation. Vis. Res. 2019, 162, 35–42. [Google Scholar] [CrossRef]
  31. Xu, H.; Liu, P.; Dayan, P.; Qian, N. Multi-Level Visual Adaptation: Dissociating Curvature and Facial-Expression Aftereffects Produced by the Same Adapting Stimuli. Vis. Res. 2012, 72, 42–53. [Google Scholar] [CrossRef]
  32. Adams, W.J.; Gray, K.L.H.; Garner, M.; Graf, E.W. High-Level Face Adaptation Without Awareness. Psychol. Sci. 2010, 21, 205–210. [Google Scholar] [CrossRef]
  33. Yang, E.; Hong, S.-W.; Blake, R. Adaptation Aftereffects to Facial Expressions Suppressed from Visual Awareness. J. Vis. 2010, 10, 24. [Google Scholar] [CrossRef]
  34. Pell, P.J.; Richards, A. Cross-Emotion Facial Expression Aftereffects. Vis. Res. 2011, 51, 1889–1896. [Google Scholar] [CrossRef]
  35. Campbell, J.; Burke, D. Evidence That Identity-Dependent and Identity-Independent Neural Populations Are Recruited in the Perception of Five Basic Emotional Facial Expressions. Vis. Res. 2009, 49, 1532–1540. [Google Scholar] [CrossRef]
  36. Fox, C.J.; Barton, J.J.S. What Is Adapted in Face Adaptation? The Neural Representations of Expression in the Human Visual System. Brain Res. 2007, 1127, 80–89. [Google Scholar] [CrossRef] [PubMed]
  37. Pell, P.J.; Richards, A. Overlapping Facial Expression Representations Are Identity-Dependent. Vis. Res. 2013, 79, 1–7. [Google Scholar] [CrossRef] [PubMed]
  38. Skinner, A.L.; Benton, C.P. The Expressions of Strangers: Our Identity-Independent Representation of Facial Expression. J. Vis. 2012, 12, 12. [Google Scholar] [CrossRef]
  39. Song, M.; Zhang, S.; Shinomori, K. The Output of Human Expression System Measured by Visual Adaptation and Its Implication for the Computer Recognition System. In Proceedings of the 2009 Ninth IEEE International Conference on Computer and Information Technology, Xiamen, China, 11–14 October 2009; pp. 31–35. [Google Scholar]
  40. Dickinson, J.E.; Mighall, H.K.; Almeida, R.A.; Bell, J.; Badcock, D.R. Rapidly Acquired Shape and Face Aftereffects Are Retinotopic and Local in Origin. Vis. Res. 2012, 65, 1–11. [Google Scholar] [CrossRef]
  41. Xu, H.; Dayan, P.; Lipkin, R.M.; Qian, N. Adaptation across the Cortical Hierarchy: Low-Level Curve Adaptation Affects High-Level Facial-Expression Judgments. J. Neurosci. 2008, 28, 3374–3383. [Google Scholar] [CrossRef]
  42. Baart, M.; Vroomen, J. Recalibration of Vocal Affect by a Dynamic Face. Exp. Brain Res. 2018, 236, 1911–1918. [Google Scholar] [CrossRef]
  43. Izen, S.C.; Lapp, H.E.; Harris, D.A.; Hunter, R.G.; Ciaramitaro, V.M. Seeing a Face in a Crowd of Emotional Voices: Changes in Perception and Cortisol in Response to Emotional Information across the Senses. Brain Sci. 2019, 9, 176. [Google Scholar] [CrossRef]
  44. Izen, S.C.; Ciaramitaro, V.M. A Crowd of Emotional Voices Influences the Perception of Emotional Faces: Using Adaptation, Stimulus Salience, and Attention to Probe Audio-Visual Interactions for Emotional Stimuli. Atten. Percept. Psychophys. 2020, 82, 3973–3992. [Google Scholar] [CrossRef]
  45. Pye, A.; Bestelmeyer, P.E.G. Evidence for a Supra-Modal Representation of Emotion from Cross-Modal Adaptation. Cognition 2015, 134, 245–251. [Google Scholar] [CrossRef]
  46. Skuk, V.G.; Schweinberger, S.R. Adaptation Aftereffects in Vocal Emotion Perception Elicited by Expressive Faces and Voices. PLoS ONE 2013, 8, e81691. [Google Scholar] [CrossRef]
  47. Watson, R.; Latinus, M.; Noguchi, T.; Garrod, O.; Crabbe, F.; Belin, P. Crossmodal Adaptation in Right Posterior Superior Temporal Sulcus during Face–Voice Emotional Integration. J. Neurosci. 2014, 34, 6813–6821. [Google Scholar] [CrossRef]
  48. Furl, N.; Van Rijsbergen, N.J.; Treves, A.; Friston, K.J.; Dolan, R.J. Experience-Dependent Coding of Facial Expression in Superior Temporal Sulcus. Proc. Natl. Acad. Sci. USA 2007, 104, 13485–13489. [Google Scholar] [CrossRef]
  49. Faul, F.; Erdfelder, E.; Lang, A.-G.; Buchner, A. G*Power 3: A Flexible Statistical Power Analysis Program for the Social, Behavioral, and Biomedical Sciences. Behav. Res. Methods 2007, 39, 175–191. [Google Scholar] [CrossRef] [PubMed]
  50. Cohen, J. Quantitative Methods in Psychology: A Power Primer. Psychol. Bull. 1992, 112, 155–159. [Google Scholar] [CrossRef] [PubMed]
  51. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; reprint; Psychology Press: New York, NY, USA, 1988; ISBN 978-0-8058-0283-2. [Google Scholar]
  52. Oldfield, R.C. The Assessment and Analysis of Handedness: The Edinburgh Inventory. Neuropsychologia 1971, 9, 97–113. [Google Scholar] [CrossRef] [PubMed]
  53. Lundqvist, D.; Flykt, A.; Öhman, A. The Karolinska Directed Emotional Faces—KDEF; CD ROM from Department of Clinical Neuroscience, Psychology section, Karolinska Institutet: Stockholm, Sweden, 1998; ISBN 91-630-7164-9. [Google Scholar]
  54. Swe, D.C.; Burton, N.S.; Rhodes, G. Are Expression Aftereffects Fully Explained by Tilt Adaptation? J. Vis. 2019, 19, 21. [Google Scholar] [CrossRef]
  55. Dale, A.M. Optimal Experimental Design for Event-Related fMRI. Hum. Brain Mapp. 1999, 8, 109–114. [Google Scholar] [CrossRef]
  56. Smith, S.M. Fast Robust Automated Brain Extraction. Hum. Brain Mapp. 2002, 17, 143–155. [Google Scholar] [CrossRef]
  57. Jenkinson, M.; Bannister, P.; Brady, M.; Smith, S. Improved Optimization for the Robust and Accurate Linear Registration and Motion Correction of Brain Images. NeuroImage 2002, 17, 825–841. [Google Scholar] [CrossRef]
  58. Jenkinson, M.; Smith, S. A Global Optimisation Method for Robust Affine Registration of Brain Images. Med. Image Anal. 2001, 5, 143–156. [Google Scholar] [CrossRef]
  59. Andersson, J.L.R.; Jenkinson, M.; Smith, S. Non-Linear Optimisation; FMRIB Centre: Oxford, UK, 2007; pp. 1–16. [Google Scholar]
  60. Andersson, J.L.R.; Jenkinson, M.; Smith, S. Non-Linear Registration Aka Spatial Normalisation; FMRIB Centre: Oxford, UK, 2007; pp. 1–21. [Google Scholar]
  61. Worsley, K.J. Statistical Analysis of Activation Images. In Functional Magnetic Resonance Imaging: An Introduction to Methods; Oxford University Press: Oxford, UK, 2001; Volume 14, pp. 251–270. ISBN 978-0-19-263071-1. [Google Scholar]
  62. Cziraki, C.; Greenlee, M.W.; Kovács, G. Neural Correlates of High-Level Adaptation-Related Aftereffects. J. Neurophysiol. 2010, 103, 1410–1417. [Google Scholar] [CrossRef]
  63. Kaiser, D.; Walther, C.; Schweinberger, S.R.; Kovács, G. Dissociating the Neural Bases of Repetition-Priming and Adaptation in the Human Brain for Faces. J. Neurophysiol. 2013, 110, 2727–2738. [Google Scholar] [CrossRef]
  64. Kanwisher, N.; McDermott, J.; Chun, M.M. The Fusiform Face Area: A Module in Human Extrastriate Cortex Specialized for Face Perception. J. Neurosci. 1997, 17, 4302–4311. [Google Scholar] [CrossRef] [PubMed]
  65. Mumford, J.A.; Turner, B.O.; Ashby, F.G.; Poldrack, R.A. Deconvolving BOLD Activation in Event-Related Designs for Multivoxel Pattern Classification Analyses. NeuroImage 2012, 59, 2636–2643. [Google Scholar] [CrossRef] [PubMed]
  66. Liu, P.; Montaser-Kouhsari, L.; Xu, H. Effects of Face Feature and Contour Crowding in Facial Expression Adaptation. Vis. Res. 2014, 105, 189–198. [Google Scholar] [CrossRef] [PubMed]
  67. Matsumiya, K. Seeing a Haptically Explored Face: Visual Facial-Expression Aftereffect From Haptic Adaptation to a Face. Psychol. Sci. 2013, 24, 2088–2098. [Google Scholar] [CrossRef] [PubMed]
  68. Baas, D.; Aleman, A.; Kahn, R.S. Lateralization of Amygdala Activation: A Systematic Review of Functional Neuroimaging Studies. Brain Res. Rev. 2004, 45, 96–103. [Google Scholar] [CrossRef]
  69. Murphy, F.C.; Nimmo-Smith, I.; Lawrence, A.D. Functional Neuroanatomy of Emotions: A Meta-Analysis. Cogn. Affect. Behav. Neurosci. 2003, 3, 207–233. [Google Scholar] [CrossRef] [PubMed]
  70. Xu, P.; Peng, S.; Luo, Y.; Gong, G. Facial Expression Recognition: A Meta-Analytic Review of Theoretical Models and Neuroimaging Evidence. Neurosci. Biobehav. Rev. 2021, 127, 820–836. [Google Scholar] [CrossRef]
  71. Dricu, M.; Frühholz, S. A Neurocognitive Model of Perceptual Decision-making on Emotional Signals. Hum. Brain Mapp. 2020, 41, 1532–1556. [Google Scholar] [CrossRef]
  72. Gasquoine, P.G. Contributions of the Insula to Cognition and Emotion. Neuropsychol. Rev. 2014, 24, 77–87. [Google Scholar] [CrossRef]
  73. Gogolla, N. The Insular Cortex. Curr. Biol. 2017, 27, R580–R586. [Google Scholar] [CrossRef] [PubMed]
  74. Uddin, L.Q.; Nomi, J.S.; Hébert-Seropian, B.; Ghaziri, J.; Boucher, O. Structure and Function of the Human Insula. J. Clin. Neurophysiol. 2017, 34, 300–306. [Google Scholar] [CrossRef] [PubMed]
  75. Menon, V.; Uddin, L.Q. Saliency, Switching, Attention and Control: A Network Model of Insula Function. Brain Struct. Funct. 2010, 214, 655–667. [Google Scholar] [CrossRef]
  76. Seeley, W.W.; Menon, V.; Schatzberg, A.F.; Keller, J.; Glover, G.H.; Kenna, H.; Reiss, A.L.; Greicius, M.D. Dissociable Intrinsic Connectivity Networks for Salience Processing and Executive Control. J. Neurosci. 2007, 27, 2349–2356. [Google Scholar] [CrossRef]
  77. Zhang, Y.; Padmanabhan, A.; Gross, J.J.; Menon, V. Development of Human Emotion Circuits Investigated Using a Big-Data Analytic Approach: Stability, Reliability, and Robustness. J. Neurosci. 2019, 39, 7155–7172. [Google Scholar] [CrossRef]
  78. Butler, A.; Oruc, I.; Fox, C.J.; Barton, J.J.S. Factors Contributing to the Adaptation Aftereffects of Facial Expression. Brain Res. 2008, 1191, 116–126. [Google Scholar] [CrossRef]
  79. Minemoto, K.; Ueda, Y.; Yoshikawa, S. The Aftereffect of the Ensemble Average of Facial Expressions on Subsequent Facial Expression Recognition. Atten. Percept. Psychophys. 2022, 84, 815–828. [Google Scholar] [CrossRef] [PubMed]
  80. Jabbi, M.; Keysers, C. Inferior Frontal Gyrus Activity Triggers Anterior Insula Response to Emotional Facial Expressions. Emotion 2008, 8, 775–780. [Google Scholar] [CrossRef] [PubMed]
  81. Nomura, M.; Iidaka, T.; Kakehi, K.; Tsukiura, T.; Hasegawa, T.; Maeda, Y.; Matsue, Y. Frontal Lobe Networks for Effective Processing of Ambiguously Expressed Emotions in Humans. Neurosci. Lett. 2003, 348, 113–116. [Google Scholar] [CrossRef]
  82. Uono, S.; Sato, W.; Kochiyama, T.; Sawada, R.; Kubota, Y.; Yoshimura, S.; Toichi, M. Neural Substrates of the Ability to Recognize Facial Expressions: A Voxel-Based Morphometry Study. Soc. Cogn. Affect. Neurosci. 2016, 12, 487–495. [Google Scholar] [CrossRef]
  83. Dal Monte, O.; Krueger, F.; Solomon, J.M.; Schintu, S.; Knutson, K.M.; Strenziok, M.; Pardini, M.; Leopold, A.; Raymont, V.; Grafman, J. A Voxel-Based Lesion Study on Facial Emotion Recognition after Penetrating Brain Injury. Soc. Cogn. Affect. Neurosci. 2013, 8, 632–639. [Google Scholar] [CrossRef]
  84. Liu, M.; Liu, C.H.; Zheng, S.; Zhao, K.; Fu, X. Reexamining the Neural Network Involved in Perception of Facial Expression: A Meta-Analysis. Neurosci. Biobehav. Rev. 2021, 131, 179–191. [Google Scholar] [CrossRef]
  85. Bonnici, H.M.; Richter, F.R.; Yazar, Y.; Simons, J.S. Multimodal Feature Integration in the Angular Gyrus during Episodic and Semantic Retrieval. J. Neurosci. 2016, 36, 5462–5471. [Google Scholar] [CrossRef]
  86. Humphreys, G.F.; Lambon Ralph, M.A.; Simons, J.S. A Unifying Account of Angular Gyrus Contributions to Episodic and Semantic Cognition. Trends Neurosci. 2021, 44, 452–463. [Google Scholar] [CrossRef] [PubMed]
  87. Seghier, M.L. The Angular Gyrus: Multiple Functions and Multiple Subdivisions. Neuroscientist 2013, 19, 43–61. [Google Scholar] [CrossRef] [PubMed]
  88. Seghier, M.L. Multiple Functions of the Angular Gyrus at High Temporal Resolution. Brain Struct. Funct. 2023, 228, 7–46. [Google Scholar] [CrossRef] [PubMed]
  89. Humphreys, G.F.; Tibon, R. Dual-Axes of Functional Organisation across Lateral Parietal Cortex: The Angular Gyrus Forms Part of a Multi-Modal Buffering System. Brain Struct. Funct. 2023, 228, 341–352. [Google Scholar] [CrossRef] [PubMed]
  90. Thomas Yeo, B.T.; Krienen, F.M.; Sepulcre, J.; Sabuncu, M.R.; Lashkari, D.; Hollinshead, M.; Roffman, J.L.; Smoller, J.W.; Zöllei, L.; Polimeni, J.R.; et al. The Organization of the Human Cerebral Cortex Estimated by Intrinsic Functional Connectivity. J. Neurophysiol. 2011, 106, 1125–1165. [Google Scholar] [CrossRef] [PubMed]
  91. Marek, S.; Dosenbach, N.U.F. The Frontoparietal Network: Function, Electrophysiology, and Importance of Individual Precision Mapping. Dialogues Clin. Neurosci. 2018, 20, 133–140. [Google Scholar] [CrossRef] [PubMed]
  92. Cole, M.W.; Repovš, G.; Anticevic, A. The Frontoparietal Control System: A Central Role in Mental Health. Neuroscientist 2014, 20, 652–664. [Google Scholar] [CrossRef] [PubMed]
  93. Spreng, R.N.; Sepulcre, J.; Turner, G.R.; Stevens, W.D.; Schacter, D.L. Intrinsic Architecture Underlying the Relations among the Default, Dorsal Attention, and Frontoparietal Control Networks of the Human Brain. J. Cogn. Neurosci. 2013, 25, 74–86. [Google Scholar] [CrossRef]
  94. Dickinson, J.E.; Badcock, D.R. On the Hierarchical Inheritance of Aftereffects in the Visual System. Front. Psychol. 2013, 4, 472. [Google Scholar] [CrossRef]
  95. Ochsner, K.N.; Silvers, J.A.; Buhle, J.T. Functional Imaging Studies of Emotion Regulation: A Synthetic Review and Evolving Model of the Cognitive Control of Emotion: Functional Imaging Studies of Emotion Regulation. Ann. N. Y. Acad. Sci. 2012, 1251, E1–E24. [Google Scholar] [CrossRef]
  96. Thielscher, A.; Pessoa, L. Neural Correlates of Perceptual Choice and Decision Making during Fear–Disgust Discrimination. J. Neurosci. 2007, 27, 2908–2917. [Google Scholar] [CrossRef]
Figure 1. Stimuli and experimental design. (A) Three adapting images (i.e., sad, neutral, and happy adaptors) and eight morphed test images (happy proportion = 0.15, 0.3, 0.4, 0.5, 0.6, 0.7, 0.85, and 1.0). The original images of the three adaptors were obtained from the KDEF database [53] (http://www.emotionlab.se/resources/kdef, accessed on 5 September 2018). All test images were created from the three adaptors with WebMorph (STOIKimage, https://webmorph.org, accessed on 5 September 2018). (B) General experimental procedure and single-trial procedure. The orange box represents the practice session before scanning. Blue and green boxes represent the first and second runs of each adaptor condition, with and without 30 s of preadaptation, respectively.
Figure 2. The emotional adaptation aftereffect in behavior. (A) The fraction of happy responses plotted as a function of the proportion of happiness of the test images, separately for the happy (R² = 0.98 ± 0.03), neutral (R² = 0.99 ± 0.01), and sad (R² = 0.97 ± 0.04) adaptor conditions. (B) The mean PSE shift of the happy and sad adaptor conditions from the neutral adaptor condition across all participants. Error bars represent standard errors. *** p < 0.001.
Figure 3. Brain regions showing differential neural activation across the three adaptor conditions (happy, neutral, and sad) during emotion judgment on the neutral test images. All activations were whole-brain-corrected and thresholded at Z > 2.6. R = right.
Figure 4. Percent signal changes for the three perception-based conditions in the four ROIs. * p < 0.05, ** p < 0.01, and *** p < 0.001. Abbreviations: AG, angular gyrus; IFGpo, pars opercularis of the inferior frontal gyrus; FFA, fusiform face area.