The Open Neuroscience Journal
2013, 7: 5-18. Published online 2013 October 18. DOI: 10.2174/1874082001307010005
Publisher ID: TONEURJ-7-5
Neural Dynamics of Audiovisual Integration for Speech and Non-Speech Stimuli: A Psychophysical Study
ABSTRACT
This study investigated the extent to which audiovisual speech integration is special by comparing behavioral and neural measures obtained with both speech and non-speech stimuli. An audiovisual recognition experiment presented listeners with auditory, visual, and audiovisual stimuli. The auditory component consisted of sine-wave speech, and the visual component consisted of point-light displays, in which dots mark a talker's points of articulation. In the first phase, listeners performed a discrimination task while unaware of the linguistic nature of the auditory and visual stimuli. In the second phase, they were informed that the auditory and visual stimuli were spoken utterances of /be/ (“bay”) and /de/ (“day”), and they performed the same task. The neural dynamics of audiovisual integration were investigated using EEG, including mean Global Field Power and current density reconstruction (CDR). As predicted, the speech and non-speech stimuli recruited divergent regions of multisensory integration, with greater posterior parietal activation in the non-speech condition. Conversely, reaction-time measures indicated qualitatively similar multisensory integration across experimental conditions.