The neural dynamics of hierarchical Bayesian causal inference in multisensory perception.
Tim Rohe, Ann-Christine Ehlis, Uta Noppeney. Published in: Nature Communications (2019)
Transforming the barrage of sensory signals into a coherent multisensory percept relies on solving the binding problem - deciding whether signals come from a common cause and should be integrated or, instead, segregated. Human observers typically arbitrate between integration and segregation consistent with Bayesian Causal Inference, but the neural mechanisms remain poorly understood. Here, we presented observers with audiovisual sequences that varied in the number of flashes and beeps, and combined Bayesian modelling with EEG representational similarity analyses. Our data suggest that the brain initially represents the number of flashes and beeps independently. Later, it computes their numbers by averaging the forced-fusion and segregation estimates weighted by the probabilities of the common and independent cause models (i.e. model averaging). Crucially, prestimulus oscillatory alpha power and phase correlate with observers' prior beliefs about the world's causal structure that guide their arbitration between sensory integration and segregation.
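To make the model-averaging computation concrete, the sketch below implements the standard Bayesian Causal Inference scheme (in the form introduced by Körding et al., 2007) for a single audiovisual trial: the forced-fusion and segregation estimates are combined, weighted by the posterior probability of a common versus independent causes. This is an illustrative sketch, not the authors' analysis code; the noise and prior parameters (sigma_a, sigma_v, mu_p, sigma_p, p_common) are assumed values for demonstration only.

```python
import numpy as np

def bci_model_averaging(x_a, x_v, sigma_a=1.0, sigma_v=1.5,
                        mu_p=2.0, sigma_p=2.0, p_common=0.5):
    """Auditory numerosity estimate (e.g. number of beeps) under model averaging.

    x_a, x_v : noisy internal measurements of the beep and flash counts
    sigma_a, sigma_v : auditory and visual sensory noise (standard deviations)
    mu_p, sigma_p : Gaussian prior over numerosity
    p_common : prior probability that both signals share a common cause
    """
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Forced fusion (C = 1): reliability-weighted average of both signals and the prior
    fusion = (x_a / var_a + x_v / var_v + mu_p / var_p) / (1/var_a + 1/var_v + 1/var_p)

    # Full segregation (C = 2): the auditory estimate ignores the visual signal
    segregation_a = (x_a / var_a + mu_p / var_p) / (1/var_a + 1/var_p)

    # Likelihood of the measurements under the common-cause model
    var_c1 = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-((x_a - x_v)**2 * var_p
                       + (x_a - mu_p)**2 * var_v
                       + (x_v - mu_p)**2 * var_a) / (2 * var_c1)) \
              / (2 * np.pi * np.sqrt(var_c1))

    # Likelihood under the independent-causes model
    var_ap, var_vp = var_a + var_p, var_v + var_p
    like_c2 = (np.exp(-(x_a - mu_p)**2 / (2 * var_ap)) / np.sqrt(2 * np.pi * var_ap)
               * np.exp(-(x_v - mu_p)**2 / (2 * var_vp)) / np.sqrt(2 * np.pi * var_vp))

    # Posterior probability of a common cause
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Model averaging: weight fusion and segregation by the causal posterior
    return post_c1 * fusion + (1 - post_c1) * segregation_a

# Example: similar beep and flash counts favour the common-cause model,
# pulling the auditory estimate towards the fused value.
print(bci_model_averaging(x_a=2.3, x_v=2.0))
```

In this scheme, the prior probability of a common cause (p_common) is the quantity that, according to the abstract, is reflected in prestimulus alpha power and phase: raising it shifts behaviour towards integration, lowering it towards segregation.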