Neurocognitive Dynamics of Prosodic Salience over Semantics during Explicit and Implicit Processing of Basic Emotions in Spoken Words
Abstract
1. Introduction
1.1. Sensory Dominance Effects: Theoretical Importance and Methodological Concerns
1.2. Effects of Communication Channel on Multi-Stage Processing of Emotional Speech
1.3. Effects of Emotion Category on Emotional Speech Processing
1.4. Effects of Task Type on Emotional Speech Processing
1.5. The Present Study
2. Materials and Methods
2.1. Participants
2.2. Stimuli
2.3. Procedure
2.4. Data Analysis
3. Results
3.1. Auditory Event-Related Potential Measures
3.2. Inter-Trial Phase Coherence Measures
3.3. Event-Related Spectral Perturbation Measures
3.4. Relationships between Auditory ERP and Neural Oscillation Indices
3.5. Behavioral Results
4. Discussion
4.1. Effects of Communication Channels on Emotional Speech Perception
4.2. Effects of Emotion Categories on Emotional Speech Perception
4.3. Effects of Task Types on Emotional Speech Perception
4.4. Neurophysiological and Behavioral Measures of Emotional Speech Perception
4.5. Implications, Limitations and Future Studies
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Index | Channel | Emotion | Task | Task × Emotion | Task × Channel | Emotion × Channel | Task × Emotion × Channel
---|---|---|---|---|---|---|---
N100 | χ2 = 58.58 *** | χ2 = 72.23 *** | χ2 = 43.63 *** | χ2 = 9.33 * | n.s. | χ2 = 12.65 ** | χ2 = 13.05 **
P200 | χ2 = 267.71 *** | χ2 = 29.81 *** | χ2 = 324.60 *** | χ2 = 15.49 *** | n.s. | χ2 = 42.86 *** | χ2 = 24.45 ***
N400 | χ2 = 99.53 *** | χ2 = 127.02 *** | χ2 = 127.04 *** | χ2 = 7.45 * | χ2 = 62.33 *** | χ2 = 124.44 *** | χ2 = 49.24 ***
LPC | χ2 = 242.33 *** | χ2 = 61.53 *** | χ2 = 18.60 *** | χ2 = 97.46 *** | n.s. | n.s. | χ2 = 58.30 ***
Accuracy | n.s. | F = 15.79 *** | F = 61.32 *** | F = 11.75 *** | F = 8.55 ** | n.s. | n.s.
Reaction time | F = 9.54 ** | n.s. | F = 188.88 *** | n.s. | n.s. | n.s. | n.s.
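The asterisks in tables like this one conventionally mark effects that survive a multiple-comparison correction. As an illustration only (the specific correction applied to these omnibus tests is not stated in this excerpt), a minimal Benjamini–Hochberg false-discovery-rate procedure can be sketched as follows:

```python
import numpy as np

def fdr_bh(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean array marking which p-values are declared
    significant while controlling the false-discovery rate at q.
    """
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)                     # indices of p sorted ascending
    thresholds = q * np.arange(1, n + 1) / n  # BH critical values i*q/n
    below = p[order] <= thresholds
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    rejected = np.zeros(n, dtype=bool)
    rejected[order[:k]] = True                # reject the k smallest p-values
    return rejected

# With q = 0.05, the three smallest of these four p-values survive correction.
print(fdr_bh([0.001, 0.5, 0.002, 0.03]))
```

The step-up rule rejects every p-value up to the largest rank whose sorted value falls below its critical line, which is less conservative than Bonferroni while still bounding the expected proportion of false positives.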
Time Window | Frequency Band | Channel | Emotion | Task
---|---|---|---|---
N100 | Delta ITPC | χ2 = 22.07 *** | χ2 = 9.64 * | χ2 = 20.05 ***
N100 | Theta ITPC | χ2 = 24.67 *** | χ2 = 10.65 * | χ2 = 19.87 ***
N100 | Alpha ITPC | n.s. | n.s. | χ2 = 10.00 *
P200 | Delta ITPC | χ2 = 45.06 *** | χ2 = 7.86 * | χ2 = 13.17 ***
P200 | Theta ITPC | χ2 = 29.41 *** | n.s. | χ2 = 16.74 ***
P200 | Alpha ITPC | χ2 = 13.31 ** | n.s. | χ2 = 6.45 *
N400 | Delta ITPC | χ2 = 9.92 * | n.s. | n.s.
N400 | Theta ITPC | n.s. | n.s. | n.s.
N400 | Alpha ITPC | n.s. | n.s. | n.s.
LPC | Delta ITPC | n.s. | n.s. | n.s.
LPC | Theta ITPC | n.s. | n.s. | n.s.
LPC | Alpha ITPC | n.s. | n.s. | n.s.

No significant two-way or three-way interaction effects were found.
Time Window | Frequency Band | Channel | Emotion | Task
---|---|---|---|---
N100 | Delta ERSP | χ2 = 39.61 *** | n.s. | n.s.
N100 | Theta ERSP | χ2 = 28.04 *** | n.s. | χ2 = 9.20 **
N100 | Alpha ERSP | χ2 = 4.41 * | n.s. | n.s.
P200 | Delta ERSP | χ2 = 39.74 *** | n.s. | n.s.
P200 | Theta ERSP | χ2 = 36.66 *** | n.s. | n.s.
P200 | Alpha ERSP | χ2 = 11.16 *** | n.s. | n.s.
N400 | Delta ERSP | n.s. | n.s. | χ2 = 10.53 **
N400 | Theta ERSP | n.s. | n.s. | n.s.
N400 | Alpha ERSP | n.s. | n.s. | n.s.
LPC | Delta ERSP | n.s. | n.s. | n.s.
LPC | Theta ERSP | n.s. | n.s. | n.s.
LPC | Alpha ERSP | n.s. | n.s. | χ2 = 32.16 ***

No significant two-way or three-way interaction effects were found.
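Event-related spectral perturbation is the companion power measure: trial-averaged spectral power expressed in decibels relative to a pre-stimulus baseline. A minimal sketch over a single-trial power array (the shapes and parameter names are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def ersp_db(power, n_baseline):
    """Event-related spectral perturbation in dB.

    power: single-trial spectral power, shape (n_trials, n_times), values > 0.
    n_baseline: number of initial samples forming the pre-stimulus baseline.
    Each trial is scaled by its own mean baseline power, converted to dB,
    and the dB values are averaged across trials.
    """
    baseline = power[:, :n_baseline].mean(axis=1, keepdims=True)
    return (10.0 * np.log10(power / baseline)).mean(axis=0)

# A post-stimulus doubling of power appears as roughly +3 dB.
power = np.ones((40, 100))
power[:, 50:] = 2.0          # power doubles halfway through the epoch
curve = ersp_db(power, n_baseline=50)
print(round(float(curve[75]), 2))  # 3.01
```

Because the dB transform is taken per trial before averaging, the baseline portion of the curve sits at 0 dB by construction, making increases (synchronization) and decreases (desynchronization) directly comparable across conditions.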
ERP Measure | Frequency Band | Chi-Square | Parameter Estimate | Standard Error | t Value | p Value
---|---|---|---|---|---|---
N100 | Delta ITPC | 48.01 | −4.10 | 1.00 | −4.09 | <0.001
N100 | Theta ITPC | 1.30 | 0.51 | 1.37 | 0.38 | 0.255
N100 | Alpha ITPC | 6.88 | −2.58 | 0.97 | −2.65 | 0.013
N100 | Delta ERSP | 22.17 | −0.32 | 0.07 | −4.77 | <0.001
N100 | Theta ERSP | 4.09 | −0.22 | 0.11 | −2.02 | 0.065
N100 | Alpha ERSP | 0.003 | −0.005 | 0.09 | −0.054 | 0.958
P200 | Delta ITPC | 133.15 | 5.27 | 1.02 | 5.16 | <0.001
P200 | Theta ITPC | 17.17 | 3.48 | 1.51 | 2.30 | <0.001
P200 | Alpha ITPC | 3.69 | 2.17 | 1.13 | 1.92 | 0.055
P200 | Delta ERSP | 62.60 | 0.62 | 0.07 | 8.38 | <0.001
P200 | Theta ERSP | 2.76 | 0.09 | 0.21 | 0.40 | 0.097
P200 | Alpha ERSP | 3.51 | −0.22 | 0.12 | −1.87 | 0.091
N400 | Delta ITPC | 7.94 | −2.84 | 1.01 | −2.82 | 0.014
N400 | Theta ITPC | 2.75 | −2.67 | 1.73 | −1.66 | 0.146
N400 | Alpha ITPC | 0.97 | −1.60 | 1.62 | −0.99 | 0.324
N400 | Delta ERSP | 2.61 | 0.10 | 0.17 | 0.611 | 0.318
N400 | Theta ERSP | 0.02 | 0.06 | 0.21 | 0.303 | 0.900
N400 | Alpha ERSP | 0.27 | −0.05 | 0.10 | −0.516 | 0.901
LPC | Delta ITPC | 13.49 | 3.04 | 0.82 | 3.71 | <0.001
LPC | Theta ITPC | 6.58 | 3.47 | 1.35 | 2.58 | 0.015
LPC | Alpha ITPC | 0.44 | −0.84 | 1.28 | −0.66 | 0.509
LPC | Delta ERSP | 1.48 | 0.49 | 0.15 | 3.35 | 0.225
LPC | Theta ERSP | 3.36 | −0.63 | 0.20 | −3.20 | 0.100
LPC | Alpha ERSP | 6.86 | 0.22 | 0.08 | 2.62 | 0.026
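Brain-behavior associations of this kind are typically estimated with linear mixed-effects models that treat subject as a random factor. A common simpler cross-check, used in the auditory EEG literature when comparing mixed models against correlation, is to correlate the two measures within each subject and average the coefficients. A minimal sketch with invented array shapes (not the analysis reported here):

```python
import numpy as np

def mean_within_subject_r(x, y):
    """Average within-subject Pearson correlation.

    x, y: arrays of shape (n_subjects, n_observations), e.g. per-condition
    ITPC values and ERP amplitudes for each subject. Correlating within
    subjects keeps between-subject offsets from masquerading as an effect.
    """
    rs = [np.corrcoef(xi, yi)[0, 1] for xi, yi in zip(x, y)]
    return float(np.mean(rs))

# If each subject's amplitude is a linear function of ITPC with a
# subject-specific offset, the within-subject correlation is still 1.
rng = np.random.default_rng(1)
itpc_vals = rng.random((10, 6))
amp = 5.0 * itpc_vals + rng.standard_normal((10, 1))  # per-subject offset
print(round(mean_within_subject_r(itpc_vals, amp), 6))  # 1.0
```

Unlike a mixed model, this summary ignores unequal trial counts and random slopes, so it serves only as a sanity check on the direction and rough strength of the reported estimates.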
N100 and P200 index the early stages (basic auditory processing); N400, LPC, and behavioral identification index the late stages (higher-order cognitive processing).

Index | N100 | P200 | N400 | LPC | Accuracy | Reaction Time
---|---|---|---|---|---|---
Main effect of channel | Pro > Sem (amplitude; delta and theta ITPC and ERSP) | Pro > Sem (amplitude; all ITPC and ERSP) | Pro > Sem (amplitude; delta ITPC) | Pro > Sem (amplitude) | Pro ≈ Sem | Pro < Sem
Main effect of emotion | Hap > Neu ≈ Sad (amplitude; delta and theta ITPC) | Hap > Neu ≈ Sad (amplitude); Hap > Neu (delta ITPC) | Sad > Hap > Neu (amplitude) | Hap ≈ Sad > Neu (amplitude) | Neu ≈ Hap > Sad | No main effect
Main effect of task | Exp > Imp (amplitude; all ITPC and theta ERSP) | Exp > Imp (amplitude; all ITPC and theta ERSP) | Exp > Imp (amplitude); Exp < Imp (delta ERSP) | Exp < Imp (amplitude; alpha ERSP) | Exp < Imp | Exp > Imp
Interaction among factors | Pro > Sem except for sadness (amplitude) | Pro > Sem except for sadness in implicit tasks (amplitude) | Pro > Sem for neutrality in both tasks and for sadness in implicit tasks (amplitude); Sem > Pro for happiness in explicit tasks (amplitude) | Pro > Sem except for happiness in explicit tasks (amplitude) | Pro > Sem except in implicit tasks; Neu ≈ Hap > Sad in explicit tasks only | Pro ≈ Sem
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lin, Y.; Fan, X.; Chen, Y.; Zhang, H.; Chen, F.; Zhang, H.; Ding, H.; Zhang, Y. Neurocognitive Dynamics of Prosodic Salience over Semantics during Explicit and Implicit Processing of Basic Emotions in Spoken Words. Brain Sci. 2022, 12, 1706. https://doi.org/10.3390/brainsci12121706
Lin Y, Fan X, Chen Y, Zhang H, Chen F, Zhang H, Ding H, Zhang Y. Neurocognitive Dynamics of Prosodic Salience over Semantics during Explicit and Implicit Processing of Basic Emotions in Spoken Words. Brain Sciences. 2022; 12(12):1706. https://doi.org/10.3390/brainsci12121706
Chicago/Turabian Style
Lin, Yi, Xinran Fan, Yueqi Chen, Hao Zhang, Fei Chen, Hui Zhang, Hongwei Ding, and Yang Zhang. 2022. "Neurocognitive Dynamics of Prosodic Salience over Semantics during Explicit and Implicit Processing of Basic Emotions in Spoken Words" Brain Sciences 12, no. 12: 1706. https://doi.org/10.3390/brainsci12121706