The Development of a Multi-Modality Emotion Recognition Test Presented via a Mobile Application
Abstract
1. Introduction
1.1. Emotion Recognition
1.2. Measurement of Emotion Recognition
1.2.1. Single-Modal Emotion Recognition Tests
1.2.2. Multi-Modal Emotion Recognition Tests
1.2.3. The Unmet Need for, and Challenges in, Developing an Emotion Recognition Test
Test Name | Country | Number of Items | Test Methods or Procedures | Reliability/Validity |
---|---|---|---|---|
Japanese and Caucasian Brief Affect Recognition Test (JACBART) [22] | Japan | 56 a | Rate the degree of the 7 emotions | Internal reliability; test–retest reliability; concurrent validity; convergent validity |
The Chinese Facial Emotion Recognition Database (CFERD) [23] | Taiwan | 100 a | 7-emotion classification | NA |
NA [32] | Malaysia | 56 a | 7-emotion classification | NA |
Chinese Affective Picture System (CAPS) [33] | China | 60 a | 4-emotion classification | NA |
Test Name | Country | Number of Items | Test Content | Test Methods or Procedures | Reliability/Validity |
---|---|---|---|---|---|
Florida Affect Battery [24] | US | 232 a,b,c | Similar to MMER | 5-emotion classification | Test–retest 0.89–0.97 |
The Awareness of Social Inference Test (TASIT)—part 1: Emotion Evaluation Test (EET) [27] | Australia | 28 e | Similar to subtest 3 | 7-emotion classification | Parallel-forms reliability; construct validity |
Multimodal Emotion Recognition Test (MERT) [28] | Switzerland | 120 a,b,d,e | Similar to subtest 3 | 10-emotion classification | Interrater 0.38; test–retest 0.78; EFA |
Florida Affect Battery (Chinese version) [26] | Taiwan | 225 a,b,c (Western facial stimuli; Chinese prosody stimuli) | Similar to MMER | 5-emotion classification | Content validity; criterion validity; norm comparison |
Geneva Emotion Recognition Test (GERT) [29] | Switzerland | 108 e | Similar to subtest 3 | 14-emotion classification | NA |
Geneva Emotion Recognition Test—short form (GERT-S) [30] | Switzerland | 42 e | Similar to subtest 3 | 14-emotion classification | Cronbach's alpha = 0.80; CFA good fit |
Emotion Recognition Assessment in Multiple Modalities Test (ERAM) [31] | Sweden | 72 a,b,c | Similar to subtest 3 | 12-emotion classification | Cronbach's alpha = 0.74; CFA good fit |
1.3. Aim
2. Materials and Methods
2.1. Participants
2.2. Procedure for the Development of the MMER App
2.2.1. Item Generation of the MMER App
2.2.2. The Subtests of the MMER App
2.2.3. The Pretest Stage
2.2.4. The MMER App
2.3. Measurement
2.4. Data Analysis
3. Results
3.1. Performance
3.2. Reliability, Confirmatory Factor Analyses, and Criterion-Related Validity
3.2.1. Reliability
3.2.2. Confirmatory Factor Analysis
3.2.3. Criterion-Related Validity
3.3. Multiple Stepwise Regression for Application
MMER app total score = 135.88 − 4.007 × (gender − 0.33) − 0.750 × (age − 48.28) + 1.645 × (education − 13.73),

where gender was coded male = 1 and female = 0. The predictors are centered at the sample means reported in the Results, so the intercept equals the sample mean total score (135.88).
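To illustrate how such a normative equation can be applied, the following minimal Python sketch (ours, not code from the MMER app) computes the demographically expected total score; it assumes the centered-predictor form reconstructed above, with the sample mean of 135.88 as the intercept.

```python
# Minimal sketch (assumption: centered-predictor form with the sample-mean
# intercept of 135.88, as reconstructed above; not code from the MMER app).

def expected_mmer_total(male: bool, age: float, education_years: float) -> float:
    """Demographically expected MMER app total score (maximum 198)."""
    gender = 1.0 if male else 0.0  # male = 1, female = 0, per the text
    return (135.88                        # sample mean total score (intercept)
            - 4.007 * (gender - 0.33)     # centered at 33% male
            - 0.750 * (age - 48.28)       # centered at the mean age, in years
            + 1.645 * (education_years - 13.73))  # centered at mean education

# Example: a 65-year-old woman with 12 years of education
print(round(expected_mmer_total(male=False, age=65, education_years=12), 1))
# -> 121.8
```

An individual's observed total score can then be compared with this expectation; a markedly lower observed score suggests below-expectation performance for that demographic profile.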
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Schirmer, A. Is the voice an auditory face? An ALE meta-analysis comparing vocal and facial emotion processing. Soc. Cogn. Affect. Neurosci. 2018, 13, 1–13.
- Frith, C. Role of facial expressions in social interactions. Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 3453–3458.
- Mier, D.; Lis, S.; Neuthe, K.; Sauer, C.; Esslinger, C.; Gallhofer, B.; Kirsch, P. The involvement of emotion recognition in affective theory of mind. Psychophysiology 2010, 47, 1028–1039.
- Leppanen, J.M.; Hietanen, J.K. Emotion recognition and social adjustment in school-aged girls and boys. Scand. J. Psychol. 2001, 42, 429–435.
- Ruffman, T.; Henry, J.; Livingstone, V.; Phillips, L.H. A meta-analytic review of emotion recognition and aging: Implications for neuropsychological models of aging. Neurosci. Biobehav. Rev. 2008, 32, 863–881.
- Sullivan, S.; Ruffman, T.; Hutton, S.B. Age Differences in Emotion Recognition Skills and the Visual Scanning of Emotion Faces. J. Gerontol. Ser. B Psychol. Sci. Soc. Sci. 2007, 62, P53–P60.
- Calder, A.J.; Keane, J.; Manly, T.; Sprengelmeyer, R.H.; Scott, S.; Nimmo-Smith, I.; Young, A.W. Facial expression recognition across the adult life span. Neuropsychologia 2003, 41, 195–202.
- Hargrave, R.; Maddock, R.J.; Stone, V. Impaired recognition of facial expressions of emotion in Alzheimer’s disease. J. Neuropsychiatry Clin. Neurosci. 2002, 14, 64–71.
- Yu, R.-L.; Chen, P.S.; Tu, S.-C.; Tsao, W.-C.; Tan, C.-H. Emotion-Specific Affective Theory of Mind Impairment in Parkinson’s Disease. Sci. Rep. 2018, 8, 16043.
- Sachse, M.; Schlitt, S.; Hainz, D.; Ciaramidaro, A.; Walter, H.; Poustka, F.; Bölte, S.; Freitag, C.M. Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder. Schizophr. Res. 2014, 159, 509–514.
- Valdespino, A.; Antezana, L.; Ghane, M.; Richey, J.A. Alexithymia as a Transdiagnostic Precursor to Empathy Abnormalities: The Functional Role of the Insula. Front. Psychol. 2017, 8, 2234.
- Haxby, J.V.; Hoffman, E.A.; Gobbini, M. Human neural systems for face recognition and social communication. Biol. Psychiatry 2002, 51, 59–67.
- Mitchell, R.L.; Phillips, L. The overlapping relationship between emotion perception and theory of mind. Neuropsychologia 2015, 70, 1–10.
- Adolphs, R. Recognizing Emotion from Facial Expressions: Psychological and Neurological Mechanisms. Behav. Cogn. Neurosci. Rev. 2002, 1, 21–62.
- Calder, A.J.; Keane, J.; Manes, F.; Antoun, N.; Young, A.W. Impaired recognition and experience of disgust following brain injury. Nat. Neurosci. 2000, 3, 1077–1078.
- Dodich, A.; Cerami, C.; Canessa, N.; Crespi, C.; Marcone, A.; Arpone, M.; Realmuto, S.; Cappa, S. Emotion recognition from facial expressions: A normative study of the Ekman 60-Faces Test in the Italian population. Neurol. Sci. 2014, 35, 1015–1021.
- Nowicki, S.; Duke, M.P. Nonverbal Receptivity: The Diagnostic Analysis of Nonverbal Accuracy (DANVA); Erlbaum Associates: Mahwah, NJ, USA, 2001.
- Ekman, P.; Friesen, W.V.; O’Sullivan, M.; Chan, A.; Diacoyanni-Tarlatzis, I.; Heider, K.; Krause, R.; LeCompte, W.A.; Pitcairn, T.; Ricci-Bitti, P.E.; et al. Universals and cultural differences in the judgments of facial expressions of emotion. J. Personal. Soc. Psychol. 1987, 53, 712.
- Jack, R.E.; Garrod, O.G.B.; Yu, H.; Caldara, R.; Schyns, P.G. Facial expressions of emotion are not culturally universal. Proc. Natl. Acad. Sci. USA 2012, 109, 7241–7244.
- Jack, R.E.; Blais, C.; Scheepers, C.; Schyns, P.G.; Caldara, R. Cultural Confusions Show that Facial Expressions Are Not Universal. Curr. Biol. 2009, 19, 1543–1548.
- Jack, R.E.; Caldara, R.; Schyns, P.G. Internal representations reveal cultural diversity in expectations of facial expressions of emotion. J. Exp. Psychol. Gen. 2012, 141, 19–25.
- Matsumoto, D.; LeRoux, J.; Wilson-Cohn, C.; Raroque, J.; Kooken, K.; Ekman, P.; Yrizarry, N.; Loewinger, S.; Uchida, H.; Yee, A.; et al. A New Test to Measure Emotion Recognition Ability: Matsumoto and Ekman’s Japanese and Caucasian Brief Affect Recognition Test (JACBART). J. Nonverbal Behav. 2000, 24, 179–209.
- Huang, C.L.-C.; Hsiao, S.; Hwu, H.-G.; Howng, S.-L. The Chinese Facial Emotion Recognition Database (CFERD): A computer-generated 3-D paradigm to measure the recognition of facial emotional expressions at different intensities. Psychiatry Res. 2012, 200, 928–932.
- Bowers, D.; Blonder, L.; Heilman, K. Florida Affect Battery; Center for Neuropsychological Studies, Department of Neurology: Gainesville, FL, USA, 1998.
- Sedda, A.; Petito, S.; Guarino, M.; Stracciari, A. Identification and intensity of disgust: Distinguishing visual, linguistic and facial expressions processing in Parkinson disease. Behav. Brain Res. 2017, 330, 30–36.
- Liu, Y.C.; Yeh, Z.T.; Tsai, M.C. The Validity of the Florida Affect Battery Test and Performance in Patients with Prefrontal Cortex Damage. J. Ment. Health 2012, 25, 299–334.
- McDonald, S.; Flanagan, S.; Rollins, J.; Kinch, J. TASIT: A New Clinical Tool for Assessing Social Perception After Traumatic Brain Injury. J. Head Trauma Rehabil. 2003, 18, 219–238.
- Bänziger, T.; Grandjean, D.; Scherer, K.R. Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT). Emotion 2009, 9, 691–704.
- Schlegel, K.; Grandjean, D.; Scherer, K.R. Introducing the Geneva Emotion Recognition Test: An example of Rasch-based test development. Psychol. Assess. 2014, 26, 666–672.
- Schlegel, K.; Scherer, K.R. Introducing a short version of the Geneva Emotion Recognition Test (GERT-S): Psychometric properties and construct validation. Behav. Res. Methods 2016, 48, 1383–1392.
- Laukka, P.; Bänziger, T.; Israelsson, A.; Cortes, D.S.; Tornberg, C.; Scherer, K.R.; Fischer, H. Investigating individual differences in emotion recognition ability using the ERAM test. Acta Psychol. 2021, 220, 103422.
- Tan, C.B.; Sheppard, E.; Stephen, I.D. A change in strategy: Static emotion recognition in Malaysian Chinese. Cogent Psychol. 2015, 2, 1085941.
- Xia, M.; Li, X.; Zhong, H.; Li, H. Fixation Patterns of Chinese Participants while Identifying Facial Expressions on Chinese Faces. Front. Psychol. 2017, 8, 8.
- Folstein, M.F.; Folstein, S.E.; McHugh, P.R. “Mini-mental state”. A practical method for grading the cognitive state of patients for the clinician. J. Psychiatr. Res. 1975, 12, 189–198.
- Cheng, C.M.; Chen, H.C.; Chan, Y.C.; Su, Y.C.; Tseng, C.C. Taiwan corpora of Chinese emotions and relevant psychophysiological data—Normative Data for Chinese Jokes. Chin. J. Psychol. 2013, 55, 555–569.
- Chiou, B.-C.; Chen, C.-P. Speech emotion recognition with cross-lingual databases. In Fifteenth Annual Conference of the International Speech Communication Association; ISCA: Baixas, France, 2014.
- Huang, S.T.; Lee, M.C.; Lee, L.; Chan, Y.T.; Tsai, H.T. Taiwan corpora of Chinese emotional stimuli database—The study of emotional prosody. Chin. J. Psychol. 2014, 56, 383–398.
- Baron-Cohen, S.; Wheelwright, S.; Hill, J.; Raste, Y.; Plumb, I. The “Reading the Mind in the Eyes” test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. J. Child Psychol. Psychiatry 2001, 42, 241–251.
- Waaramaa, T. Gender differences in identifying emotions from auditory and visual stimuli. Logop. Phoniatr. Vocology 2016, 42, 160–166.
- Scherer, K.R.; Johnstone, T.; Klasmeyer, G. Vocal Expression of Emotion; Oxford University Press: Oxford, UK, 2003.
- Blair, R.J.R. Fine Cuts of Empathy and the Amygdala: Dissociable Deficits in Psychopathy and Autism. Q. J. Exp. Psychol. 2008, 61, 157–170.
- Abbruzzese, L.; Magnani, N.; Robertson, I.H.; Mancuso, M. Age and Gender Differences in Emotion Recognition. Front. Psychol. 2019, 10, 2371.
- Trauffer, N.M.; Widen, S.C.; Russell, J.A. Education and the attribution of emotion to facial expressions. Psihol. Teme 2013, 22, 237–247.
Demographic Variables | Mean (SD) | Range
---|---|---
Gender (male, %) | 32.54 (46.99) | -
Age, years | 48.28 (20.27) | 18–80
Education, years | 13.73 (2.97) | 6–20
Subtests of the MMER App | Full Mark | Correct Score Mean (SD) | Correct Score Range | Accuracy Mean (SD)
---|---|---|---|---
Total score | 198 | 135.88 (22.14) | 71.00–174.00 | 0.68 (0.11) |
Subtest 1 | 24 | 22.42 (2.09) | 13.00–24.00 | 0.93 (0.09) |
Subtest 3 | 30 | 22.14 (4.08) | 10.00–29.00 | 0.74 (0.14) |
Subtest 4 | 20 | 15.74 (2.82) | 6.00–20.00 | 0.79 (0.14) |
Subtest 5 | 26 | 20.07 (3.96) | 3.00–26.00 | 0.77 (0.15) |
Face-related subtests a | 100 | 57.95 (9.85) | 24.00–74.00 | 0.76 (0.13) |
Subtest 7 (prosody-related subtest) | 25 | 11.12 (3.90) | 3.00–21.00 | 0.44 (0.16) |
Subtest 8 | 35 | 20.62 (5.19) | 5.00–30.00 | 0.59 (0.15) |
Subtest 9 | 38 | 23.76 (5.25) | 8.00–35.00 | 0.63 (0.14) |
Face–prosody subtests b | 73 | 44.38 (9.63) | 22.00–65.00 | 0.61 (0.13)
Emotion | Full Mark | Correct Score Mean (SD) | Correct Score Range | Accuracy Mean (SD) |
---|---|---|---|---|
Total score | 174 | 113.46 (21.58) | 49.00–151.00 | 0.66 (0.12) |
Neutral | 25 | 17.20 (4.12) | 4.00–25.00 | 0.69 (0.16) |
Happiness | 21 | 15.89 (2.61) | 9.00–21.00 | 0.76 (0.12) |
Sadness | 23 | 15.91 (4.02) | 4.00–22.00 | 0.69 (0.17) |
Anger | 25 | 20.17 (3.42) | 9.00–25.00 | 0.84 (0.14)
Disgust | 29 | 15.78 (4.43) | 4.00–24.00 | 0.54 (0.15) |
Fear | 29 | 13.51 (4.60) | 4.00–23.00 | 0.47 (0.16) |
Surprise | 22 | 15.01 (3.43) | 3.00–21.00 | 0.68 (0.16) |
Version and subtests of the MMER app | ||||
Face-related subtests a | 76 | |||
Neutral | 10 | 7.88 (1.69) | 2.00–10.00 | 0.79 (0.17) |
Happiness | 8 | 7.67 (0.72) | 3.00–8.00 | 0.96 (0.09) |
Sadness | 10 | 7.33 (1.75) | 2.00–10.00 | 0.73 (0.17) |
Anger | 12 | 10.47 (1.62) | 4.00–12.00 | 0.87 (0.13)
Disgust | 14 | 10.76 (2.98) | 1.00–14.00 | 0.77 (0.21) |
Fear | 14 | 6.90 (3.05) | 1.00–14.00 | 0.49 (0.22) |
Surprise | 8 | 6.95 (1.21) | 1.00–8.00 | 0.87 (0.15) |
Prosody-related subtest (subtest 7) | 25 | |||
Neutral | 4 | 2.27 (1.17) | 0.00–4.00 | 0.57 (0.29) |
Happiness | 3 | 1.33 (0.92) | 0.00–3.00 | 0.44 (0.31) |
Sadness | 4 | 2.21 (1.23) | 0.00–4.00 | 0.55 (0.31) |
Anger | 2 | 1.59 (0.59) | 0.00–2.00 | 0.80 (0.30)
Disgust | 4 | 0.93 (0.92) | 0.00–3.00 | 0.23 (0.23) |
Fear | 4 | 1.33 (1.03) | 0.00–4.00 | 0.33 (0.26) |
Surprise | 4 | 1.46 (0.97) | 0.00–4.00 | 0.37 (0.24) |
Face–prosody subtests b | 73 |
Neutral | 11 | 7.05 (2.43) | 1.00–11.00 | 0.64 (0.22) |
Happiness | 10 | 6.88 (1.83) | 2.00–10.00 | 0.69 (0.18) |
Sadness | 9 | 6.37 (1.88) | 1.00–9.00 | 0.71 (0.21) |
Anger | 11 | 8.11 (2.12) | 1.00–11.00 | 0.74 (0.19)
Disgust | 11 | 4.08 (1.87) | 0.00–9.00 | 0.37 (0.17) |
Fear | 11 | 5.29 (1.79) | 1.00–9.00 | 0.48 (0.16) |
Surprise | 10 | 6.60 (2.09) | 1.00–10.00 | 0.66 (0.21) |
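As a quick consistency check on the tables above (our own verification, not the authors'), each Accuracy value is simply the mean correct score divided by the row's full mark, since per-person accuracy is correct/full and averaging is linear:

```python
# Consistency check (ours): Accuracy = mean correct score / full mark,
# using values copied from a few rows of the tables above.
rows = {
    # row label: (full mark, correct-score mean, reported accuracy)
    "Total score":       (174, 113.46, 0.66),
    "Neutral (overall)": (25, 17.20, 0.69),
    "Happiness (face)":  (8, 7.67, 0.96),
    "Disgust (prosody)": (4, 0.93, 0.23),
}
for label, (full, mean_correct, reported) in rows.items():
    derived = mean_correct / full
    assert abs(derived - reported) < 0.01, label  # agrees up to rounding
    print(f"{label}: {derived:.2f} vs reported {reported:.2f}")
```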
Model | χ2 | df | RMSEA (90% CI) | CFI | TLI | AIC |
---|---|---|---|---|---|---|
Model 1: Unidimensional | 45.282 | 14 | 0.115 (0.079–0.153) | 0.958 | 0.936 | 7425.075 |
Model 2: Oblique (2 factors) | 12.444 | 13 | <0.001 (0.000–0.074) | 1.000 | 1.001 | 7394.237 |
Model 3: Orthogonal (2 factors) | 174.631 | 14 | 0.261 (0.227–0.296) | 0.782 | 0.673 | 7554.424 |
Model 4: Oblique (3 factors) | 10.787 | 12 | <0.001 (0.000–0.072) | 1.000 | 1.003 | 7394.580 |
Model 5: Orthogonal (3 factors) | 186.673 | 15 | 0.260 (0.228–0.294) | 0.767 | 0.674 | 7564.466 |
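For readers checking the fit indices above, RMSEA follows directly from the reported χ² and df via the standard formula RMSEA = √(max(χ² − df, 0) / (df · (N − 1))). The sample size is not restated with this table; N ≈ 170 in the sketch below is back-solved from Model 1's reported RMSEA and is our assumption, not a figure quoted from the paper.

```python
# Sketch (ours): reproduce the RMSEA column from chi-square and df.
# Assumption: N = 170, back-solved from Model 1's reported RMSEA of 0.115.
from math import sqrt

def rmsea(chi2: float, df: int, n: int) -> float:
    """Steiger's RMSEA from the chi-square statistic, degrees of freedom, and N."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

N = 170  # assumed sample size (see lead-in)
print(f"Model 1: {rmsea(45.282, 14, N):.3f}")   # 0.115, matches the table
print(f"Model 3: {rmsea(174.631, 14, N):.3f}")  # 0.261, matches the table
print(f"Model 2: {rmsea(12.444, 13, N):.3f}")   # 0.000 (chi2 < df), reported as <0.001
```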
Subtests | Facial Recognition | Facial Emotion Recognition | Prosody Emotion Recognition
---|---|---|---
1 | 1.000 | |
3 | | 0.861 |
4 | | 0.858 |
5 | | 0.840 |
7 | | | 0.846
8 | | | 0.891
9 | | | 0.807
Cite as: Yu, R.-L.; Poon, S.-F.; Yi, H.-J.; Chien, C.-Y.; Hsu, P.-H. The Development of a Multi-Modality Emotion Recognition Test Presented via a Mobile Application. Brain Sci. 2022, 12, 251. https://doi.org/10.3390/brainsci12020251