Neural Correlates of Modality-Sensitive Deviance Detection in the Audiovisual Oddball Paradigm
Abstract
1. Introduction
1.1. Electrophysiological Studies of Audiovisual Fusion
1.1.1. The N1–P2 in AV Speech Perception
1.1.2. The MMN in AV Speech Perception
1.2. The Present Study
2. Materials and Methods
2.1. Participants
2.2. Materials
2.3. Procedure
2.4. EEG Preprocessing and Analysis
3. Results
3.1. Usable Trials
3.2. ERP Results
3.2.1. Unimodal Auditory Results
3.2.2. Unimodal Visual Results
3.2.3. Audiovisual Results
3.2.4. Post Hoc Analyses
4. Discussion
4.1. N1–P2
4.2. Early and Late MMN
4.3. Modality-Sensitive Illusory Perception
4.4. Limitations and Future Directions
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Condition | Standard (80%; auditory + visual) | Deviant (20%; auditory + visual) |
|---|---|---|
| AsVc (auditory standard, visual change) | /ba/ + /ba/ | /ba/ + /ga/ |
| VsAc (visual standard, auditory change) | /ba/ + /ba/ | /ga/ + /ba/ |
| AO (auditory only) | /ba/ | /ga/ |
| VO (visual only) | /ba/ | /ga/ |
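To make the 80/20 oddball structure in the table above concrete, the sketch below shows one way such a trial sequence could be pseudorandomized. This is an illustration only, not the authors' stimulus-presentation code; the trial count, random seed, and the no-consecutive-deviants constraint are assumptions.

```python
# Illustrative sketch only -- not the authors' presentation code.
# Trial count, seed, and the "no consecutive deviants" rule are assumptions.
import random

def make_oddball_sequence(n_trials=100, deviant_prop=0.20, seed=1):
    """Return a pseudorandomized list of 'standard'/'deviant' labels (80/20 split)."""
    rng = random.Random(seed)
    n_dev = round(n_trials * deviant_prop)
    trials = ["deviant"] * n_dev + ["standard"] * (n_trials - n_dev)
    while True:
        rng.shuffle(trials)
        # Assumed constraint: never present two deviants in a row.
        if all(not (a == b == "deviant") for a, b in zip(trials, trials[1:])):
            return trials

# Map labels onto the AsVc condition from the table above
# (standard = auditory /ba/ + visual /ba/; deviant = auditory /ba/ + visual /ga/).
asvc_stimuli = {"standard": ("/ba/", "/ba/"), "deviant": ("/ba/", "/ga/")}
sequence = [asvc_stimuli[label] for label in make_oddball_sequence()]
print(sequence[:10])
```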
| Time Window | Auditory Only (unimodal), Mdiff (μV) | Visual Only (unimodal), Mdiff (μV) | VsAc (bimodal), Mdiff (μV) | AsVc (bimodal), Mdiff (μV) |
|---|---|---|---|---|
| N1 (50–100 ms) | n.s. | −0.430 ** LT; −1.096 ** CP | −0.312 * LT | −0.531 ** LT; −0.654 ** CP |
| P2 (100–150 ms) | n.s. | n.s. | n.s. | n.s. |
| Early MMN (150–200 ms) | −0.451 * LT | n.s. | n.s. | −0.699 ** LT; −1.09 ** CP |
| Late MMN (300–400 ms) | −1.119 *** FR | n.s. | −0.4756 ** LT; −0.403 * CP | n.s. |

Mdiff = mean amplitude difference (deviant − standard); n.s. = not significant; LT = left temporal, CP = centro-parietal, and FR = frontal electrode regions.
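The Mdiff values above are mean amplitudes of the deviant-minus-standard difference, averaged over a time window and an electrode cluster. The sketch below illustrates how such a value could be computed from epoched EEG data; it is a minimal example under assumed array shapes, sampling rate, and a hypothetical channel cluster, not the authors' analysis pipeline.

```python
# Minimal sketch of a windowed mean-amplitude difference (Mdiff), assuming
# made-up data dimensions and channel indices -- not the authors' pipeline.
import numpy as np

def mdiff(epochs, labels, channel_idx, times, tmin, tmax):
    """Mean amplitude of deviant minus standard ERPs over a window and cluster.

    epochs: array (n_trials, n_channels, n_times); labels: array of
    'standard'/'deviant' strings; times: sample times in seconds.
    """
    window = (times >= tmin) & (times <= tmax)

    def mean_amp(which):
        erp = epochs[labels == which].mean(axis=0)  # average over trials
        return erp[np.ix_(channel_idx, np.where(window)[0])].mean()

    return mean_amp("deviant") - mean_amp("standard")

# Toy example: 100 trials, 32 channels, 500 ms epochs sampled at 500 Hz.
rng = np.random.default_rng(0)
times = np.arange(0, 0.5, 0.002)
epochs = rng.normal(size=(100, 32, times.size))
labels = np.array(["deviant" if i % 5 == 0 else "standard" for i in range(100)])
left_temporal = [10, 11, 12]  # hypothetical electrode cluster
print(mdiff(epochs, labels, left_temporal, times, 0.150, 0.200))  # early MMN window
```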
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Randazzo, M.; Priefer, R.; Smith, P.J.; Nagler, A.; Avery, T.; Froud, K. Neural Correlates of Modality-Sensitive Deviance Detection in the Audiovisual Oddball Paradigm. Brain Sci. 2020, 10, 328. https://doi.org/10.3390/brainsci10060328