The phase of cortical oscillations determines the perceptual fate of visual cues in naturalistic audiovisual speech.

Raphaël Thézé, Anne-Lise Giraud, Pierre Mégevand
Published in: Science Advances (2020)
When we see our interlocutor, our brain seamlessly extracts visual cues from their face and processes them along with the sound of their voice, making speech an intrinsically multimodal signal. Visual cues are especially important in noisy environments, when the auditory signal is less reliable. Neuronal oscillations might be involved in the cortical processing of audiovisual speech by selecting which sensory channel contributes more to perception. To test this, we designed computer-generated naturalistic audiovisual speech stimuli where one mismatched phoneme-viseme pair in a key word of sentences created bistable perception. Neurophysiological recordings (high-density scalp and intracranial electroencephalography) revealed that the precise phase angle of theta-band oscillations in posterior temporal and occipital cortex of the right hemisphere was crucial to select whether the auditory or the visual speech cue drove perception. We demonstrate that the phase of cortical oscillations acts as an instrument for sensory selection in audiovisual speech processing.