Article

Morphing Task: The Emotion Recognition Process in Children with Attention Deficit Hyperactivity Disorder and Autism Spectrum Disorder

1 Department of Human Neurosciences, Sapienza University of Rome, 00185 Rome, Italy
2 Istituto Neurologico Mediterraneo Neuromed IRCCS, 86077 Pozzilli, Italy
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2021, 18(24), 13273; https://doi.org/10.3390/ijerph182413273
Submission received: 6 October 2021 / Revised: 24 November 2021 / Accepted: 6 December 2021 / Published: 16 December 2021
(This article belongs to the Section Disabilities)

Abstract

Recognizing a person's identity is a fundamental social ability, and facial expressions in particular are extremely important in social cognition. Individuals affected by autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) display impairment in the recognition of emotions and, consequently, in the recognition of emotion-related expressions and even of identity. The aim of our study was to compare the performance of participants with ADHD, ASD, and typical development (TD) with regard to both accuracy and speed in a morphing task, and to determine whether digitized cartoon faces could significantly facilitate the process of emotion recognition in ASD patients (particularly for disgust). The study investigated the emotion recognition process through dynamic pictures (human faces vs. cartoon faces) created with the morphing technique in three pediatric populations (7–12 years old): ADHD patients, ASD patients, and an age-matched control sample (TD). Response accuracy was compared between the three groups with the Chi-square test, and response latency with a multivariate analysis of variance, in order to determine whether there were statistically significant differences (p < 0.05) in the recognition of basic emotions. The results demonstrated a faster response time in neurotypical children than in children with ASD or ADHD, with ADHD participants performing better than ASD participants on the same task. Overall accuracy did not differ significantly between the ADHD and ASD groups.

1. Introduction

Faces are complex stimuli that convey social and affective information; recognizing a person's identity is a fundamental social ability [1]. In fact, people deduce personality traits from the similarity between the morphological features of a person's face and emotional expressions [2,3]. Individuals affected by autism spectrum disorder (ASD) display impairment in the recognition of emotions and, consequently, also in the recognition of emotion-related expressions and even of identity.
This is one of the reasons that the practical and clinical applications of automatic emotion recognition have been extensively tested and validated in some neurodevelopmental disorders [4,5,6], particularly in participants with ASD and attention deficit hyperactivity disorder (ADHD).
Facial expression recognition involves dynamic and multimodal phenomena. Facial transformation from a neutral expression sends complex signals, which are converted into emotions [7]. The practical and clinical applications of automatic emotion recognition have been extensively tested and validated [8,9,10,11,12].
ASD is known to be associated with difficulties in using facial expressions to convey emotions and deficits in emotional reciprocity [13,14,15,16,17]. Difficulties in understanding emotions through facial expressions affect a person’s ability to appropriately respond to different situations [18,19,20,21,22]. Since the 1970s, impaired emotion recognition has been described in people with ASD. The six basic emotions (happiness, sadness, fear, disgust, anger, and surprise) identified by Ekman have been investigated [23]. However, emotion recognition findings have been inconsistent to date. Some authors have linked ASD to deficits in the recognition of specific subsets of emotions, which have variously included fear, anger, disgust, and sadness [24,25,26,27], while others observed specific deficits in the recognition of anger or surprise [28,29]. Finally, other studies found no evidence of emotion recognition impairments compared to the general population [30]. It appears that the performance of ASD participants on emotion recognition tasks is influenced by different variables, including age. Some authors found that emotion recognition deficits in children aged 5–7 years decreased in late childhood, but no further improvement has been detected during adolescence or adulthood [31]. A meta-analysis carried out by Lozier and colleagues [32] confirmed that ASD was associated with face-emotion recognition deficits, and that the magnitude of these deficits increased with age.
ADHD is a neurodevelopmental disorder characterized by attention deficit, hyperactivity, and impulsivity (for a more in-depth view, see the Diagnostic and Statistical Manual of Mental Disorders-5 (DSM-5)) [14]. A number of studies found that, in addition to impaired executive functions and problematic behavior [33], emotion recognition deficits associated with interpersonal difficulties may be present, similar to those found in ASD [34].
Children with ADHD are less accepted and often rejected by their peers [35,36,37]. These social difficulties are likely to persist into adulthood. Although some theories assume that emotion recognition deficits are explained by general attentional deficits, increasing evidence suggests that they may actually constitute a distinct impairment. Schönenberg et al. [38] found that individuals with ADHD exhibited impaired recognition of sad and fearful facial expressions, while Schwenck et al. [39] did not find significant differences in emotion recognition between children with ADHD and matched controls. Finally, Jusyte et al. [8] found that, compared to controls, children with ADHD exhibited lower accuracy rates across all basic emotional expressions. Although emotion recognition abilities have been investigated in both ADHD and ASD, few studies have directly compared the performance obtained by individuals with the aforementioned disorders. Furthermore, existing studies have measured expression recognition abilities by employing static images of expressions at their highest intensity. For instance, Berggren et al. [40] used the Frankfurt Test for Facial Affect Recognition to compare matched samples of children with ADHD and ASD. They found that the ADHD group responded faster than the ASD group, but they did not find any difference in accuracy between the two groups. The same test was used by Sinzig et al. [41], who reported facial affect recognition deficits in children suffering from ADHD and in children suffering from both ASD and ADHD when compared to healthy controls. Demopoulos et al. [42] compared the social cognitive profiles of children and adolescents with ASD and ADHD using the Diagnostic Assessment of Nonverbal Accuracy-2 (DANVA-2) to measure effects on facial and vocal identification abilities. Both groups performed significantly worse than the normative sample.
The morphing task has been used since 2001 to evaluate children with ADHD and ASD [44], and it is still used today [45,46,47]. However, studies in the literature have primarily focused on visual updating; studies comparing the two clinical groups (ADHD and ASD) with a control group through human and animated morphing tasks are scarce.
We aimed to investigate the performance of ADHD, ASD, and TD participants with regard to both accuracy and speed in the morphing task and to determine whether the use of dynamic pictures representing cartoon faces could significantly facilitate the process of recognizing emotions in ASD participants.
We hypothesized that children with ADHD would respond faster than ASD participants due to their executive and attention deficits [40], in contrast to the findings of Schwenck et al. in 2013, who demonstrated no difference in response accuracy or speed between the ADHD and control groups [39]. We also expected that response accuracy would be reduced, particularly for negative emotions [32–48]. Finally, based on the results obtained in 2015 by Brosnan with ASD participants [49], we hypothesized that the use of cartoon faces might facilitate the recognition of emotions as compared to stimuli representing human faces.
Our study aimed to advance the understanding of the emotion recognition process. While many studies have investigated this process in ADHD and ASD populations, comparative studies between the two clinical populations are scarce and their results have been conflicting, especially when conducted using tasks with dynamic images (representing cartoon faces). Therefore, we used dynamic images of cartoon faces for emotion recognition, with a view to using the morphing task as a tool to enhance the expression recognition process in rehabilitation settings.

2. Methods

2.1. Participants

Sixty-two children in the age range 7–12 years were recruited and assigned to one of three groups: (1) an ADHD combined type (ADHD-C) group; (2) an ASD group; and (3) an age-matched control group.
Children recruited for the first two groups had been referred, together with their parents, to the Department of Human Neurosciences at the Policlinico Umberto I clinic in Rome for a follow-up evaluation; recruitment to our study took place between May and September 2019. The 21 children in the ADHD group and the 20 children in the ASD group were included after meeting the DSM-5 diagnostic criteria for ADHD-C or ASD, respectively [14], and after scoring ≥85 on the Full-Scale Intelligence Quotient (FSIQ). Participants were excluded if their first language was not Italian, in order to avoid bias related to linguistic difficulties, or if they had comorbid medical or psychiatric disorders detected through a review of their medical records, in order to avoid the risk of bias due to an associated diagnosis. The control participants were 21 children with typical development (TD) recruited from a primary school in Rome. The inclusion criteria for this group were an age range of 7–12 years and an average score (above the 25th percentile) on the Raven's Coloured Progressive Matrices (CPM) [50] or Standard Progressive Matrices (SPM) [51]. The exclusion criteria were visual or auditory impairment or neurodevelopmental, neurological, or organic disorders, as determined through an interview with the parents. The study was approved by the ethics committee of Sapienza University of Rome and performed in compliance with the Declaration of Helsinki (2000). Written informed consent was obtained from the participants' parents.

2.2. Morphing Task

All participants were evaluated using a morphing task. Morphing is an image processing technique commonly used to transform one image gradually into another (e.g., an initial image of a child gradually turns into an image of an adult).
In our study, 24 videoclips showed neutral faces gradually developing into basic emotions (sadness, anger, surprise, happiness, disgust, and fear). The videoclips were built using FantaMorph (version 5, Abrosoft Co., Eden Prairie, MN, USA), software used to create morphing images and sophisticated animation effects, including the transformation of the images used.
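For illustration, the frame-by-frame logic of one such clip can be approximated with a simple cross-dissolve between the neutral and the fully expressive photograph. This is only a minimal sketch under stated assumptions (plain pixel blending rather than FantaMorph's feature-based warping, and hypothetical file names), not the software actually used in the study.

```python
# Minimal cross-dissolve sketch (assumptions: grayscale, same-size, roughly aligned face
# images; FantaMorph additionally warps facial features, which is not reproduced here).
import numpy as np
from PIL import Image

def morph_frames(neutral_path: str, emotion_path: str, n_frames: int = 60):
    """Yield n_frames images blending linearly from a neutral to a fully expressive face."""
    neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=np.float32)
    emotion = np.asarray(Image.open(emotion_path).convert("L"), dtype=np.float32)
    for i in range(n_frames):
        alpha = i / (n_frames - 1)                 # 0.0 = neutral, 1.0 = full expression
        blend = (1.0 - alpha) * neutral + alpha * emotion
        yield Image.fromarray(blend.astype(np.uint8))

# Hypothetical usage: write the 60 frames of one 7 s clip to disk.
# for k, frame in enumerate(morph_frames("neutral.png", "disgust.png")):
#     frame.save(f"frame_{k:02d}.png")
```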
It should be emphasized that the group of complex emotions (e.g., pride, embarrassment, jealousy) was excluded, since these emotions imply the attribution of a cognitive state as well as an emotion and are more dependent on context and culture [52]. They can also be based on belief [53]. Children with TD begin to recognize and verbally label complex emotions, such as embarrassment, pride, and jealousy, at age 7 [54]. Golan et al. [55] suggested that recognition of complex emotions is also impaired in children with ASD.
These videoclips were split into two conditions to compare emotion recognition in human and cartoon faces. These two conditions were chosen on the basis of studies carried out by Jusyte in 2017 [8] and Brosnan et al. in 2015 [49], which analyzed emotion recognition through human animated stimuli (in the first case) and through human and digital animated stimuli (in the second). Our task included a training session and two different conditions (Figure 1 and Figure 2) [56,57].

2.3. Training Session

During the training session, participants watched two videoclips in which a neutral expression turned into a sad or happy expression in order to introduce children to the task. All children passed the initial trial and were then admitted to the experiment.

2.4. First Condition

In the first condition, we presented greyscale images representing 12 young adults, six females and six males, selected from the Cohn-Kanade database [56,57]. We removed potentially distracting details, such as hair or body parts, in order to focus attention on the faces.

2.5. Second Condition

In the second condition, we used colored pictures showing cartoon faces of six females and six males. These images were included in the Facial Expression Research Group Database (FERG-DB) [58]. They were digitally generated using Maya software (version 1, Autodesk, Mill Valley, CA, USA) and the images were created using a 2D renderer (version 7.1.8, Unity, San Francisco, CA, USA) [58].

2.6. Procedure

Participants and their parents were informed about the study procedures and all parents provided written informed consent to participate. The two clinical groups were assessed in the same administration room, while the control group was assessed in a room within their school. The room had a table with a notebook and two chairs (one for the participant and one for the doctor) and no distracting elements such as posters, billboards, or games.
In the training session, participants were instructed by the child neuropsychiatrist to press a button on the keyboard as soon as a face corresponding to the lexical label appeared on the screen.
Each videoclip lasted 7 s and included 60 frames. Answers and response latency were recorded on the appropriate answer sheet. In both conditions, the videoclips were presented in the same pseudo-randomized order for each child. The experiment was conducted using a Microsoft Office PowerPoint presentation (total time about 20 min), shown on a computer with a 10-inch screen viewed at a distance of 50 cm.
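To make the fixed pseudo-randomized order concrete, the sketch below shuffles the twelve emotion-by-gender clips of one condition with a fixed seed, so that every child sees exactly the same order. The seed value and clip labels are illustrative assumptions; the study does not describe its randomization procedure in more detail.

```python
# Illustrative sketch only: a fixed-seed shuffle produces one pseudo-random clip order
# that is identical for every participant (seed and labels are hypothetical).
import random

EMOTIONS = ["sadness", "anger", "surprise", "happiness", "disgust", "fear"]
GENDERS = ["M", "F"]   # gender of the face shown in the clip

def presentation_order(seed: int = 2019):
    """Return the same pseudo-random order of the 12 clips of one condition."""
    clips = [f"{emotion}_{gender}" for emotion in EMOTIONS for gender in GENDERS]
    rng = random.Random(seed)      # fixed seed -> reproducible order across children
    rng.shuffle(clips)
    return clips

print(presentation_order())
```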
For the statistical analysis, SPSS 19 software (International Business Machines Corporation, Armonk, NY, USA) was used. In order to compare the performance obtained by the ADHD, ASD, and TD participants with regard to both accuracy and speed on the morphing task, we performed the following analyses (an illustrative sketch of both follows the list):
  • The Chi-square test (χ2) was performed to investigate potential relationships between the number of errors (as categorical variables) and the six emotions in the two conditions (human and cartoon faces);
  • Multivariate analysis of variance (MANOVA) was used to investigate group differences with regard to response latency (speed in seconds).
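The sketch below mirrors the two analyses in Python with made-up numbers; the study itself ran them in SPSS 19, so the counts, latencies, column names, and group sizes shown here are illustrative assumptions only.

```python
# Illustrative sketch of the two analyses listed above (hypothetical data;
# the study used SPSS 19, not this code).
import pandas as pd
from scipy.stats import chi2_contingency
from statsmodels.multivariate.manova import MANOVA

# 1) Chi-square test on error counts for one emotion in one condition.
#    Rows: TD, ADHD, ASD; columns: correct vs. incorrect responses (made-up counts).
table = [[18, 2],
         [16, 4],
         [13, 7]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")

# 2) MANOVA on response latencies, with group as the factor and one latency column
#    per clip (only two clips and three children per group are shown here).
df = pd.DataFrame({
    "group": ["TD"] * 3 + ["ADHD"] * 3 + ["ASD"] * 3,
    "lat_disgust_M":   [4.1, 4.3, 4.0, 5.0, 5.2, 5.1, 5.2, 5.3, 5.4],
    "lat_happiness_F": [3.2, 3.3, 3.1, 4.5, 4.6, 4.7, 5.2, 5.3, 5.1],
})
print(MANOVA.from_formula("lat_disgust_M + lat_happiness_F ~ group", data=df).mv_test())
```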

3. Results

The participants were 20 children with ADHD combined type (ADHD-C) in the age range 7–12 years (mean age: 10.12 years; 15 males and 5 females), 21 children with ASD in the age range 7–12 years (mean age: 10.21 years; 18 males and 3 females), and 21 children with typical development (TD) in the age range 7–12 years (mean age: 9.33 years; 13 males and 8 females). Demographic characteristics are reported in Table 1.
The Chi-square test was used to compare response accuracy between the three groups in order to determine if there were statistically significant differences (p < 0.05; p < 0.01) in the recognition of basic emotions. In condition 1 (human faces), ASD children exhibited a significantly higher error frequency as compared to TD participants when they were asked to recognize disgusted faces (0.3% ADHD, 0.4% ASD, 0.2% TD; χ2 = 7.612; p = 0.022). With regard to surprise, the ADHD group exhibited the highest error frequency compared to the other two groups (0.6% ADHD, 0.2% ASD, 0.2% TD; χ2 = 6.025; p = 0.049).
In condition 2 (cartoon faces), we found a significant difference in error frequency distribution relative to the emotion of sadness between the ADHD group and the other two groups (1.0% ADHD, 0.0% ASD, 0.0% TD; χ2 = 6.620; p = 0.037). With regard to disgust, the ADHD group had the highest percentage of errors (0.4% ADHD, 0.3% ASD, 0.2% TD; χ2 = 9.371; p = 0.009).
We compared the response accuracy for the aforementioned emotions between the two clinical groups (ADHD and ASD). The results were similar to those obtained in the previous phases of this study. In particular, in condition 1 (human faces) we found no significant differences between the ADHD and ASD groups in identifying the emotion of disgust, while the error frequency distribution for surprise was significantly higher in the ADHD group (0.8% ADHD, 0.2% ASD; χ2 = 3.881; p = 0.049). In condition 2 (cartoon faces), the percentage of errors in identifying the emotion of disgust was higher in the ADHD group than in the ASD group (0.6% ADHD, 0.4% ASD; χ2 = 4.020; p = 0.045), and the error frequency for sadness was also higher in the ADHD group, although this difference did not reach significance (1.0% ADHD, 0.0% ASD; χ2 = 3.399; p = 0.065).

3.1. Misidentified Emotions

In condition 1, the emotion disgust was more frequently confused with anger in both clinical groups compared to the control group (0.4% ADHD, 0.4% ASD, 0.2% TD; χ2 = 15.899; p = 0.045). The ADHD and TD groups tended to confuse fear and the facial expression of sadness more frequently than the ASD group, while the ASD group more often confused fear and surprise compared to the other two groups (sadness: 0.3% ADHD, 0.1% ASD, 0.5% TD; surprise: 0.0% ADHD, 1.0% ASD, 0.0% TD; χ2 = 22.579; p = 0.032). In condition 2, the ADHD group more frequently confused disgust and anger as compared with the TD group, which confused these emotions less frequently (0.4% ADHD, 0.3% ASD, 0.2% TD; χ2 = 13.592; p = 0.035).

3.2. Response Latency

In order to explore the presence of significant differences between the three groups regarding response latency (speed in seconds), we conducted a multivariate analysis of variance (MANOVA). Wilks' multivariate Lambda test indicated statistically significant differences in response latency between the three groups (ΛWilks = 0.220, F = 1.701, η2 = 0.531, p = 0.020). In reporting these results, the name of the emotion is followed by M (male face) or F (female face) to indicate the gender of the stimulus presented. In the post hoc comparison in condition 1 (Table 2), for the emotions happiness F and disgust M, the TD group responded faster than the ASD and ADHD groups. For the emotions surprise M and F, anger and happiness M, and disgust F, the ASD group responded slower than the ADHD and TD groups. Concerning the emotion fear F, response latency in the ASD group was longer than in the TD group, while the ADHD group responded faster than the ASD group for anger M (Table 2).
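For reference, Wilks' Λ and the multivariate effect size are related by a standard identity, which is not stated in the study but is consistent with the values reported above (1 − 0.220^(1/2) ≈ 0.531 with s = 2):

```latex
\Lambda = \frac{\lvert \mathbf{E} \rvert}{\lvert \mathbf{E} + \mathbf{H} \rvert},
\qquad
\eta^{2}_{\mathrm{multivariate}} = 1 - \Lambda^{1/s}
```

where E is the within-groups (error) SSCP matrix, H the between-groups (hypothesis) SSCP matrix, and s is the smaller of the number of dependent variables and the number of groups minus one.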
In condition 2, the post hoc comparison in Table 3 showed slower response times in the ASD group compared to the other two groups. For the emotions surprise F and sadness M, the ASD group responded slower than only the TD group (Table 3).

4. Discussion

In this study, we investigated the behavior (in terms of speed and accuracy) of ADHD and ASD individuals in response to dynamic emotional images in a morphing task. When compared to the ASD and ADHD groups, children with TD performed better in terms of speed, while the ADHD group performed better than the ASD group. In terms of accuracy, there was no significant difference between the ADHD and ASD groups.
The findings of our analysis confirm previous studies that found a deficit in the emotion recognition process in subjects with ASD and ADHD. Our study shows that the poor performance of the ASD group, particularly for the emotion disgust, is consistent with the literature, especially Lozier's meta-analysis [32]. Meta-analyses have mainly involved sets of pictures representing human faces and have excluded other kinds of tasks (e.g., cartoon faces). In the study by Brosnan et al. [49], a population of teenagers with ASD was tested using dynamic and static stimuli representing both human and digital faces. Digital stimuli, both dynamic and static, facilitated the ASD group in identifying emotions at a level similar to the control sample, while the performance of the ASD group in response to the dynamic and static stimuli of human faces remained low compared to the control group. A similar result was shown in our study: digital stimuli facilitated the ASD group in identifying emotions, such as disgust, relative to both the TD and ADHD groups. Our data differ from those of Berggren et al. [40], who found a quicker response time for identifying emotions in the ADHD group compared to the ASD group, and no significant difference with respect to the performance of the TD group.
Specifically, in condition 1 (human faces), the emotion disgust was least recognized by the ASD group, while the emotion surprise was least recognized by the ADHD group. In condition 2 (cartoon faces), the highest error frequency was shown for the emotions sadness and disgust, in both cases by the ADHD group vs. the TD and ASD groups.
These results suggest that digital faces/cartoons may facilitate recognition for participants in these groups. The ASD group made fewer mistakes recognizing disgust, one of the hardest emotions to perceive, in condition 2. This facilitation suggests that the ASD group may be able to use atypical strategies to recognize emotions. This hypothesis originated from previous studies [30–59] in which "exaggerated" facial expressions in animated static stimuli were perceived as "typical" expressions by participants with ASD [60]. Animated stimuli result in an extreme representation of emotion in the details of the face, which leads to easier emotion recognition for these participants. The mistakes in emotion recognition were also analyzed. In human faces, the ASD and ADHD groups confused disgust with anger, and fear was significantly confused with sadness (TD > ADHD > ASD) and with surprise (ASD > other groups). In condition 2, disgust was confused with anger (ADHD > TD). These results confirm the emotional profiles of these two disorders and the tendency, mainly in ADHD subjects, for easy recognition of negative emotions, such as anger. Furthermore, ASD participants better recognized emotions in cartoons.
A qualitative observation showed that in our sample there was a lower error rate for emotion recognition in cartoon faces than in human faces. Happiness was the easiest emotion to recognize, while disgust and fear, especially in condition 1, were the most difficult. This is consistent with the literature stating that fear is easier to perceive than to recognize. Of the six basic emotions, fear may be defined as the one that conveys the strongest multisensorial signals. These signals, such as environmental threats, may play an important role in decoding emotions [61].
Disgust, after fear, was the most difficult emotion to identify, and was recognized better in the oldest neurotypical participants (similar to fear). Conversely, in ASD, the recognition deficit increased with age, especially for disgust, fear, and sadness [32]. The variability of results for different emotions reflects differences in emotion recognition development by age [62]. For the speed parameter in the “human faces” condition, a longer response latency of the ASD group was shown for both negative and positive emotions. Furthermore, the ADHD group had a shorter response time for anger M compared to participants with TD and ASD. The cartoon condition showed a significantly longer response latency in the ASD group, as compared to the ADHD and TD groups, in negative emotions such as fear M, sadness M/F, anger F, and disgust F, and in only two positive emotions, happiness M and surprise F. These response latency results confirmed the hypothesis that both clinical groups had difficulty in emotion recognition that seems to be related to inherent social difficulties in communication and atypical interactions.
In brief, the difficulty of the emotion recognition process in these participants was evidenced by:
  • Greater latency in emotion recognition in both clinical groups;
  • A greater emotion recognition error rate compared to the control group;
  • A tendency to confuse certain emotions (e.g., fear/sadness, anger/disgust).
We hypothesized that there would be an emotion recognition deficit in children with ASD and ADHD compared to the TD control group when using a dynamic stimulus, such as the morphing test. Overall, a faster response time was detected in the group of children with TD compared to the ASD and ADHD groups. The ADHD group, in turn, had a faster response time than the ASD group. Regarding the accuracy of the overall performance, the ADHD and ASD groups did not show a significant difference and were largely superimposable, while the performance of the TD group was qualitatively better. Our data confirm previous studies in the literature, and the advantages of the morphing technique are supported by this evidence. Dynamic stimuli may facilitate the emotion recognition process in both groups because a "dynamic picture" portrays human interactions in a more realistic way. In everyday life, facial expressions are rarely transmitted and decoded as static snapshots of internal states [7]. Dynamic faces represent a richer and more valid view of the way in which emotions and facial expressions are identified [60,61,62,63]. The morphing technique is used in many fields, from healing processes and surgery dynamics [59,60,61,62,63,64] to the emotion recognition process in psychiatric pathologies [65]. Our data also show that the use of digitalized faces/cartoons may represent a means of facilitating recognition in ASD participants.
Considering the remarkable facilitation and improved response to dynamic pictures, we hypothesize that the morphing technique could be a valuable aid to reinforce the bases of theory of mind and may support the improvement of social reciprocity in ADHD [66] and ASD participants in a therapeutic context.
Several studies have been performed in order to better understand the utility of computer-based interventions (CBIs) in teaching social and emotional skills to subjects with ASD [67,68].
Ramdoss et al. [69], in a systematic review, concluded that CBIs showed mild effects on social and emotional skills in ASD and may represent a promising practice for improving these abilities. However, the authors underlined several limitations, including the heterogeneity of subjects in terms of age and cognitive capacities. Furthermore, the literature has stressed the difficulties that individuals with ASD may have in applying the abilities acquired through CBIs to real-life situations, and has suggested combining this approach with group activities or face-to-face instruction with an adult tutor in order to improve generalization in ASD individuals.
Consistent with these studies, the findings of the present study point to the possibility that the morphing task might be a way to simplify emotion recognition in these populations, allowing them to understand the internal states of others, enhance their empathy, and, hypothetically, reduce their gaps in social relationships. In our study, this improvement occurred above all for disgust. However, it can be hypothesized that an improvement for other emotions could be achieved by perfecting the videos through the contextualization of cartoon faces.
This study has some limitations. First, the clinical sample included children of a specific age range (7–12 years old). Extending the sample from preschool to adulthood would allow the emotion recognition process to be evaluated throughout the lifespan and would allow accuracy and response latency to be associated with age in these populations, as Richoz et al. showed for the neurotypical population [7].
Second, the visual attention parameter (using eye tracking or configural face processing skills) was not assessed, and consequently differences in face perception could represent a bias of the study.
Third, all participants included in the study had a pure diagnosis of ASD or ADHD; none had an overlap of the two diagnoses. In addition, all participants had an FSIQ of 85 or more. While there is no evidence that these factors influenced the results obtained, we cannot rule out this possibility.
Fourth, the images used in condition 1 were grayscale images of young adults. This could represent a bias with respect to the use of color cartoon images in condition 2. In fact, it has been suggested in the literature, despite the scarcity of studies, that the use of color images of child and adolescent faces may provide a more realistic assessment of the ability to recognize emotions in the pediatric population.
Additional aspects of interest may include functional Magnetic Resonance Imaging (fMRI) studies of the activation of specific brain structures during the emotion recognition process in dynamic tasks [70], and eye-tracking studies examining the different viewing patterns identified [1]. Finally, we used grayscale images in the human-face condition and colored images in the cartoon condition, which may have biased our results: the better performance in the cartoon condition may be due to the colorfulness of the stimuli. Future studies should evaluate these variables.
This study demonstrates that emotion recognition was facilitated through the use of cartoon faces in ASD and ADHD populations, and that the quality of emotion recognition was comparable between ASD and ADHD participants. To our knowledge, this is the first study to compare facial emotion recognition in participants with ADHD, ASD, and TD using a morphing task.

5. Conclusions

The findings of this study may have clinical implications in the planning of alternative strategies for rehabilitation settings for children with neurodevelopmental disabilities. The theoretical implications include a better understanding of the maturational changes underlying social skill deficits. We believe that the morphing task has proven to be useful in behavioral investigations and that its use in a more natural or ecological assessment setting is promising. Finally, these results are important for future research evaluating the emotion recognition process as a therapy goal and the morphing task as a rehabilitation tool in order to improve the social and empathy skills of children with ASD and ADHD.

Author Contributions

Conceptualization, C.G., M.R. and C.S.; Data curation, A.B. and G.G.; Formal analysis, A.B. and G.G.; Investigation, C.G., G.D.V., F.G. and C.S.; Methodology, M.R., A.B., G.G. and C.S.; Project administration, G.D.V., F.G. and M.V.; Software, C.G., G.G. and M.V.; Supervision, M.R., F.G. and C.S.; Writing–original draft, M.R. and C.S.; Writing–review & editing, C.G., M.R., A.B., G.D.V., G.G., F.G., M.V. and C.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008. Ethical review and approval were waived for this study, because this research involved secondary use of clinical data which is provided without any identifier or group of identifiers which would allow attribution of private information to an individual.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon a reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, K.; Anzures, G.; Quinn, P.C.; Pascalis, O.; Ge, L.; Slater, A.M. Development of face processing expertise. In The Oxford Handbook of Face Perception; Calder, A.J., Rhodes, G., Johnson, M.H., Haxby, J.V., Eds.; Oxford University Press: Oxford, UK, 2011; pp. 753–778. [Google Scholar] [CrossRef]
  2. Flowe, H.D. Do Characteristics of Faces That Convey Trustworthiness and Dominance Underlie Perceptions of Criminality? PLoS ONE 2012, 7, e37253. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Hess, U.; Adams, R.R.; Kleck, R.E. Who may frown and who should smile? Dominance, affiliation, and the display of happiness and anger. Cogn. Emot. 2005, 19, 515–536. [Google Scholar] [CrossRef]
  4. Arsalidou, M.; Morris, D.; Taylor, M.J. Converging evidence for the advantage of dynamic facial expressions. Brain Topogr. 2011, 24, 149–163. [Google Scholar] [CrossRef] [PubMed]
  5. Pittella, E.; Fioriello, F.; Maugeri, A.; Rizzuto, E.; Piuzzi, E.; Del Prete, Z.; Sogos, C. Wereable Heart rate monitoring as stress report indicator in children with neurodevelopmental disorders. In Proceedings of the 2018 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rome, Italy, 11–13 June 2018; pp. 1–5. [Google Scholar] [CrossRef]
  6. Levi, G.; Colonnello, V.; Giacchè, R.; Piredda, M.L.; Sogos, C. Building words on actions: Verb enactment and verb recognition in children with specific language impairment. Res. Dev. Disabil. 2014, 35, 1036–1041. [Google Scholar] [CrossRef] [PubMed]
  7. Richoz, A.R.; Lao, J.; Pascalis, O.; Caldara, R. Tracking the recognition of static and dynamic facial expressions of emotion across the life span. J. Vis. 2018, 18, 5. [Google Scholar] [CrossRef] [Green Version]
  8. Jusyte, A.; Gulewitsch, M.D.; Schönenberg, M. Recognition of peer emotions in children with ADHD: Evidence from an animated facial expressions task. Psychiatry Res. 2017, 258, 351–357. [Google Scholar] [CrossRef] [PubMed]
  9. Golan, O.; Gordon, I.; Fichman, K.; Keinan, G. Specific Patterns of Emotion Recognition from Faces in Children with ASD: Results of a Cross-Modal Matching Paradigm. J. Autism Dev. Disord. 2018, 48, 844–852. [Google Scholar] [CrossRef] [PubMed]
  10. Wang, S.; Adolphs, R. Reduced specificity in emotion judgment in people with autism spectrum disorder. Neuropsychologia 2017, 99, 286–295. [Google Scholar] [CrossRef] [Green Version]
  11. Auletta, A.F.; Cupellaro, S.; Abbate, L.; Aiello, E.; Cornacchia, P.; Norcia, C.; Sogos, C. SCORS-G and Card Pull Effect of TAT Stories: A Study With a Nonclinical Sample of Children. Assessment 2020, 27, 1368–1377. [Google Scholar] [CrossRef]
  12. Levi, G.; Colonnello, V.; Giacchè, R.; Piredda, M.L.; Sogos, C. Grasping the world through words: From action to linguistic production of verbs in early childhood. Dev. Psychobiol. 2014, 56, 510–516. [Google Scholar] [CrossRef] [PubMed]
  13. Garfinkel, S.N.; Tiley, C.; O’Keeffe, S.; Harrison, N.A.; Seth, A.K.; Critchley, H.D. Discrepancies between dimensions of interoception in autism: Implications for emotion and anxiety. Biol. Psychol. 2016, 114, 117–126. [Google Scholar] [CrossRef] [PubMed]
  14. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar]
  15. Baron-Cohen, S.; Spitz, A.; Cross, P. Do children with autism recognise surprise? A research note. J. Cogn. Emot. 1993, 7, 507–516. [Google Scholar] [CrossRef]
  16. Fioriello, F.; Maugeri, A.; D’Alvia, L.; Pittella, E.; Piuzzi, E.; Rizzuto, E.; Del Prete, Z.; Manti, F.; Sogos, C. A wearable heart rate measurement device for children with autism spectrum disorder. Sci. Rep. 2020, 10, 18659. [Google Scholar] [CrossRef] [PubMed]
  17. D’Alvia, L.; Pittella, E.; Fioriello, F.; Maugeri, A.; Rizzuto, E.; Piuzzi, E.; Sogos, C.; Del Prete, Z. Heart rate monitoring under stress condition during behavioral analysis in children with neurodevelopmental disorders. In Proceedings of the 2020 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Bari, Italy, 1 June–1 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. Available online: https://ieeexplore.ieee.org/document/9137306/ (accessed on 15 September 2021).
  18. Harmsen, I.E. Empathy in Autism Spectrum Disorder. J. Autism Dev. Disord. 2019, 49, 3939–3955. [Google Scholar] [CrossRef] [PubMed]
  19. Cox, C.L.; Uddin, L.Q.; Di Martino, A.; Castellanos, F.X.; Milham, M.P.; Kelly, C. The balance between feeling and knowing: Affective and cognitive empathy are reflected in the brain’s intrinsic functional dynamics. Soc. Cogn. Affect. Neurosci. 2012, 7, 727–737. [Google Scholar] [CrossRef] [PubMed]
  20. Rogers, K.; Dziobek, I.; Hassenstab, J.; Wolf, O.T.; Convit, A. Who cares? Revisiting empathy in Asperger syndrome. J. Autism Dev. Disord. 2007, 37, 709–715. [Google Scholar] [CrossRef] [PubMed]
  21. Shamay-Tsoory, S.G.; Tomer, R.; Yaniv, S.; Aharon-Peretz, J. Empathy deficits in Asperger syndrome: A cognitive profile. Neurocase 2002, 8, 245–252. [Google Scholar] [CrossRef]
  22. Lord, C.; Elsabbagh, M.; Baird, G.; Veenstra-Vanderweele, J. Autism spectrum disorder. Lancet 2018, 392, 508–520. [Google Scholar] [CrossRef]
  23. Ekman, P. Universal facial expressions of emotions. Calif. Ment. Health Res. Dig. 1970, 8, 151–158. [Google Scholar]
  24. Pelphrey, K.A.; Sasson, N.J.; Reznick, J.S.; Paul, G.; Goldman, B.D.; Piven, J. Visual scanning of faces in autism. J. Autism Dev. Disord. 2002, 32, 249–261. [Google Scholar] [CrossRef] [PubMed]
  25. Ashwin, C.; Chapman, E.; Colle, L.; Baron-Cohen, S. Impaired recognition of negative basic emotions in autism: A test of the amygdala theory. Soc. Neurosci. 2006, 1, 349–363. [Google Scholar] [CrossRef]
  26. Humphreys, K.; Minshew, N.; Leonard, G.L.; Behrmann, M. A fine-grained analysis of facial expression processing in high-functioning adults with autism. Neuropsychologia 2007, 45, 685–695. [Google Scholar] [CrossRef]
  27. Balconi, M.; Amenta, S.; Ferrari, C. Emotional decoding in facial expression, scripts and videos: A comparison between normal, autistic and Asperger children. Res. Autism Spectr. Disord. 2012, 6, 193–203. [Google Scholar] [CrossRef]
  28. Jones, C.R.G.; Pickles, A.; Falcaro, M.; Marsden, A.J.S.; Happé, F.; Scott, S.K.; Sauter, D.; Tregay, J.; Phillips, R.J.; Baird, G.; et al. A multimodal approach to emotion recognition ability in autism spectrum disorders. J. Child Psychol. Psychiatry 2011, 52, 275–285. [Google Scholar] [CrossRef] [Green Version]
  29. Wright, B.; Clarke, N.; Jordan, J.O.; Young, A.W.; Clarke, P.; Miles, J.; Nation, K.; Clarke, L.; Williams, C. Emotion recognition in faces and the use of visual context in young people with high-functioning autism spectrum disorders. Autism 2008, 12, 607–626. [Google Scholar] [CrossRef] [PubMed]
  30. Rutherford, M.D.; McIntosh, D.N. Rules versus prototype matching. Strategies of perception of emotional facial expressions in the autism spectrum. J. Autism Dev. Disord. 2007, 37, 187–196. [Google Scholar] [CrossRef]
  31. Rump, K.M.; Giovanelli, J.L.; Minshew, N.J.; Strauss, M.S. The development of emotion recognition in individuals with autism. Child Dev. 2009, 80, 1434–1447. [Google Scholar] [CrossRef] [Green Version]
  32. Lozier, L.M.; Vanmeter, J.W.; Marsh, A.A. Impairments in facial affect recognition associated with autism spectrum disorders: A meta-analysis. Dev. Psychopathol. 2014, 26, 933–945. [Google Scholar] [CrossRef] [Green Version]
  33. Barkley, R.A.; Murphy, K.R.; Fischer, M. ADHD in Adults: What the Science Says; Guilford Publications: New York, NY, USA, 2008. [Google Scholar]
  34. Pelc, K.; Kornreich, C.; Foisy, M.L.; Dan, B. Recognition of emotional facial expressions in attention-deficit hyperactivity disorder. Pediatr. Neurol. 2006, 35, 93–97. [Google Scholar] [CrossRef]
  35. Hoza, B. Peer functioning in children with ADHD. Ambul. Pediatr. 2007, 7 (Suppl. 1), 101–106. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Hoza, B.; Mrug, S.; Gerdes, A.C.; Hinshaw, S.P.; Bukowski, W.M.; Gold, J.A.; Kraemer, H.C.; Pelham, W.E., Jr.; Wigal, T.; Arnold, L.E. What aspects of peer relationships are impaired in children with attention-deficit/hyperactivity disorder? J. Consult. Clin. Psychol. 2005, 73, 411–423. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Pelham, W.E.; Milich, R. Peer Relations in Children with Hyperactivity/Attention Deficit Disorder. J. Learn. Disabil. 1984, 17, 560–567. [Google Scholar] [CrossRef] [Green Version]
  38. Schönenberg, M.; Schneidt, A.; Wiedemann, E. Processing of dynamic affective information in adults with ADHD. J. Atten. Disord. 2015, 23, 32–39. [Google Scholar] [CrossRef]
  39. Schwenck, C.; Schneider, T.; Schreckenbach, J.; Zenglein, Y.; Gensthaler, A.; Taurines, R.; Freitag, C.M.; Schneider, W.; Romanos, M. Emotion recognition in children and adolescents with attention-deficit/hyperactivity disorder (ADHD). ADHD Atten. Deficit Hyperact. Disord. 2013, 5, 295–302. [Google Scholar] [CrossRef]
  40. Berggren, S.; Engström, A.C.; Bölte, S. Facial affect recognition in autism, ADHD and typical development. Cognit. Neuropsychiatry 2016, 21, 213–227. [Google Scholar] [CrossRef]
  41. Sinzig, J.; Morsch, D.; Lehmkuhl, G. Do hyperactivity, impulsivity and inattention have an impact on the ability of facial affect recognition in children with autism and ADHD? Eur. Child Adolesc. Psychiatry 2008, 17, 63–72. [Google Scholar] [CrossRef] [PubMed]
  42. Demopoulos, C.; Hopkins, J.; Davis, A. A comparison of social cognitive profiles in children with autism spectrum disorders and attention-deficit/hyperactivity disorder: A matter of quantitative but not qualitative difference? J. Autism Dev. Disord. 2013, 43, 1157–1170. [Google Scholar] [CrossRef]
  43. Schwenck, C.; Mergenthaler, J.; Keller, K.; Zech, J.; Salehi, S.; Taurines, R.; Romanos, M.; Schecklmann, M.; Schneider, W.; Warnke, A.; et al. Empathy in children with autism and conduct disorder: Group-specific profiles and developmental aspects. J. Child Psychol. Psychiatry 2012, 53, 651–659. [Google Scholar] [CrossRef]
  44. Teunisse, J.P.; de Gelder, B. Impaired categorical perception of facial expressions in high-functioning adolescents with autism. Child Neuropsychol. 2001, 7, 1–14. [Google Scholar] [CrossRef] [PubMed]
  45. Weber, S.; Falter-Wagner, C.; Stöttinger, E. Brief Report: Typical Visual Updating in Autism. J. Autism Dev. Disord. 2021, 51, 4711–4716. [Google Scholar] [CrossRef] [PubMed]
  46. Lee, K.S.; Chang, D.H.F. Biological motion perception is differentially predicted by Autistic trait domains. Sci. Rep. 2019, 9, 11029. [Google Scholar] [CrossRef] [Green Version]
  47. Han, B.; Tijus, C.; Le Barillier, F.; Nadel, J. Morphing technique reveals intact perception of object motion and disturbed perception of emotional expressions by low-functioning adolescents with Autism Spectrum Disorder. Res. Dev. Disabil. 2015, 47, 393–404. [Google Scholar] [CrossRef]
  48. Boakes, J.; Chapman, E.; Houghton, S.; West, J. Facial affect interpretation in boys with attention deficit/hyperactivity disorder. Child Neuropsychol. 2008, 14, 82–96. [Google Scholar] [CrossRef] [PubMed]
  49. Brosnan, M.; Johnson, H.; Grawmeyer, B.; Chapman, E.; Benton, L. Emotion Recognition in Animated Compared to Human Stimuli in Adolescents with Autism Spectrum Disorder. J. Autism Dev. Disord. 2015, 45, 1785–1796. [Google Scholar] [CrossRef] [Green Version]
  50. Raven, J.C. Coloured Progressive Matrices—CPM; Giunti OS: Firenze, Italy, 2008. [Google Scholar]
  51. Raven, J.C. Standard Progressive Matrices—SPM; Giunti OS: Firenze, Italy, 2008. [Google Scholar]
  52. Griffiths, P. What Emotions Really Are: The Problem of Psychological Categories; University of Chicago Press: Chicago, IL, USA; London, UK, 1997. [Google Scholar]
  53. Harris, P.L. Children and Emotion: The Development of Psychological Understanding; Blackwell: Oxford, UK, 1989. [Google Scholar]
  54. Baron-Cohen, S.; Golan, O.; Wheelwright, S.; Granader, Y.; Hill, J. Emotion word comprehension from 4 to 16 years old: A developmental survey. Front. Evol. Neurosci. 2010, 2, 109. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  55. Golan, O.; Sinai-Gavrilov, Y.; Baron-Cohen, S. The Cambridge Mindreading Face-Voice Battery for Children (CAM-C): Complex emotion recognition in children with and without autism spectrum conditions. Mol. Autism 2015, 6, 22. [Google Scholar] [CrossRef] [Green Version]
  56. Kanade, T.; Cohn, J.F.; Tian, Y. Comprehensive database for facial expression analysis. In Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No, PR00580), Grenoble, France, 28–30 March 2000; pp. 46–53. [Google Scholar] [CrossRef] [Green Version]
  57. Lucey, P.; Cohn, J.F.; Kanade, T.; Saragih, J.; Ambadar, Z.; Matthews, I. The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition—Workshops, San Francisco, CA, USA, 13–18 June 2010; pp. 94–101. [Google Scholar] [CrossRef] [Green Version]
  58. Aneja, D.; Colburn, A.; Faigin, G.; Shapiro, L.; Mones, B. Modeling Stylized Character Expressions via Deep Learning. In Proceedings of the Asian Conference on Computer Vision, Taipei, Taiwan, 20–24 November 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 136–153. [Google Scholar]
  59. Weber, R.; Keerl, R.; Jaspersen, D.; Huppmann, A.; Schick, B.; Draf, W. Computer-assisted documentation and analysis of wound healing of the nasal and oesophageal mucosa. J. Laryngol. Otol. 1996, 110, 1017–1021. [Google Scholar] [CrossRef] [PubMed]
  60. Trautmann, S.A.; Fehr, T.; Herrmann, M. Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Res. 2009, 1284, 100–115. [Google Scholar] [CrossRef]
  61. Smith, F.W.; Rossit, S. Identifying and detecting facial expressions of emotion in peripheral vision. PLoS ONE 2018, 13, e0197160. [Google Scholar] [CrossRef]
  62. Herba, C.M.; Landau, S.; Russell, T.; Ecker, C.; Phillips, M.L. The development of emotion- processing in children: Effects of age, emotion, and intensity. J. Child Psychol. Psychiatry 2006, 47, 1098–1106. [Google Scholar] [CrossRef]
  63. Johnston, P.; Mayes, A.; Hughes, M.; Young, A.W. Brain networks subserving the evaluation of static and dynamic facial expressions. Cortex 2013, 49, 2462–2472. [Google Scholar] [CrossRef]
  64. D’Haesek, P.F.; Cetinkaya, E.; Konrad, P.E.; Kao, C.; Dawant, B.M. Computer-aided placement of deep brain stimulators: From planning to intraoperative guidance. IEEE Trans. Med. Imaging 2005, 24, 1469–1478. [Google Scholar] [CrossRef]
  65. Collin, L.; Bindra, J.; Raju, M.; Gillberg, C.; Minnis, H. Facial Emotion Recognition in child psychiatry: A Systematic review. Res. Dev. Disabil. 2013, 34, 1505–1520. [Google Scholar] [CrossRef]
  66. Staff, A.I.; Luman, M.; van der Oord, S.; Bergwerff, C.; van den Hoodfdakker, B.; Oosterlaan, J. Facial emotion recognition impairment predicts social and emotional problems in children with (subthreshold) ADHD. Eur. Child Adolesc. Psychiatry 2021, 1–13. [Google Scholar] [CrossRef] [PubMed]
  67. Williams, B.T.; Gray, K.M.; Tonge, B.J. Teaching emotion recognition skills to young children with autism: A randomised controlled trial of an emotion training programme. J. Child Psychol. Psychiatry 2012, 53, 1268–1276. [Google Scholar] [PubMed]
  68. Young, R.L.; Posselt, M. Using the transporters DVD as a learning tool for children with Autism Spectrum Disorders (ASD). J. Autism Dev. Disord. 2012, 42, 984–991. [Google Scholar] [CrossRef] [PubMed]
  69. Ramdoss, S.; Machalicek, W.; Rispoli, M.; Mulloy, A.; Lang, R.; O’Reilly, M. Computer-based interventions to improve social and emotional skills in individuals with autism spectrum disorders: A systematic review. Dev. Neurorehabil. 2012, 15, 119–135. [Google Scholar] [CrossRef] [PubMed]
  70. Fedor, J.; Lynn, A.; Foran, W.; Di Ciccio-Bool, J.; Luna, B.; O’Hearn, K. Pattern of fixation during face recognition: Differences in autism across age. Autism 2018, 22, 866–880. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Example of a frame of a human face without distracting elements (adapted from FantaMorph). ©Jeffrey Cohn, S52.
Figure 2. Frames of cartoon faces without distracting elements.
Table 1. Demographic characteristics of the included participants: 21 children with typical development (TD), 20 children with attention deficit hyperactivity disorder (ADHD), and 20 children with autism spectrum disorder (ASD).

 | ASD | ADHD | TD
Mean age (range) | 9.33 (7–12) | 10.12 (7–12) | 10.12 (7–12)
Female (number (%)) | 8 (38) | 5 (25) | 3 (15)
Table 2. Post hoc comparison: diagnostic groups × latency morphing human faces.

Feeling of Human Face | ADHD Mean ± SD | ASD Mean ± SD | TD Mean ± SD | ASD vs. TD | ADHD vs. TD | χ2
Happiness_Female Face | 4.57 ± 1.76 | 5.24 ± 1.44 | 3.24 ± 1.31 | p < 0.01 | p < 0.05 | 0.243
Disgust_Male Face | 5.09 ± 1.05 | 5.23 ± 1.72 | 4.14 ± 1.55 | p < 0.05 | p < 0.05 | 0.104
Surprise_Male Face | 4.73 ± 1.43 | 5.56 ± 0.86 | 4.75 ± 1.19 | p < 0.05 | Not significant | 0.103
Anger_Female Face | 4.71 ± 1.43 | 5.83 ± 1.13 | 4.54 ± 1.30 | p < 0.01 | p < 0.05 | 0.173
Fear_Female Face | 5.66 ± 1.56 | 6.62 ± 0.75 | 5.31 ± 1.43 | p < 0.01 | Not significant | 0.161
Happiness_Male Face | 4.01 ± 1.06 | 5.09 ± 1.24 | 4.22 ± 0.82 | p < 0.01 | p < 0.05 | 0.170
Anger_Male Face | 4.86 ± 1.66 | 5.95 ± 1.07 | 5.14 ± 1.11 | p < 0.05 | p < 0.05 | 0.115
Surprise_Female Face | 3.66 ± 1.01 | 4.54 ± 1.41 | 3.61 ± 0.83 | p < 0.05 | Not significant | 0.136
Disgust_Female Face | 4.39 ± 1.15 | 5.29 ± 1.44 | 4.06 ± 1.06 | p < 0.01 | p < 0.05 | 0.159

Note. ADHD = attention deficit hyperactivity disorder; ASD = autism spectrum disorder; TD = typical development; χ2 = Chi-square test.
Table 3. Post hoc comparison: diagnostic groups × morphing latency cartoon faces.

Feeling of Cartoon Face | ADHD Mean ± SD | ASD Mean ± SD | TD Mean ± SD | ASD vs. TD | ADHD vs. TD | χ2
Fear_Male Face | 3.56 ± 1.49 | 4.711 ± 1.823 | 3.13 ± 1.43 | p < 0.01 | p < 0.05 | 0.157
Sadness_Female Face | 3.42 ± 1.09 | 4.14 ± 1.3 | 2.97 ± 0.65 | p < 0.01 | p < 0.05 | 0.184
Anger_Female Face | 3.04 ± 0.93 | 4.46 ± 2.17 | 3.09 ± 1.07 | p < 0.01 | Not significant | 0.167
Happiness_Male Face | 2.73 ± 0.8 | 3.71 ± 2 | 2.35 ± 0.49 | p < 0.01 | p < 0.05 | 0.176
Disgust_Female Face | 3.57 ± 1.54 | 4.7 ± 2.12 | 3.07 ± 1.24 | p < 0.01 | p < 0.05 | 0.149
Surprise_Female Face | 3.46 ± 1.05 | 4.07 ± 1.91 | 2.97 ± 0.65 | p < 0.01 | p < 0.05 | 0.111
Sadness_Male Face | 3.54 ± 1.4 | 4.1 ± 1.91 | 2.8 ± 0.542 | p < 0.01 | p < 0.05 | 0.132

Note. ADHD = attention deficit hyperactivity disorder; ASD = autism spectrum disorder; TD = typical development; χ2 = Chi-square test.