
Bayesian and Discriminative Models for Active Visual Perception across Saccades.

Divya Subramanian, John M. Pearson, Marc A. Sommer
Published in: eNeuro (2023)
The brain interprets sensory inputs to guide behavior, but behavior itself disrupts sensory inputs. Perceiving a coherent world while acting in it constitutes active perception. For example, saccadic eye movements displace visual images on the retina and yet the brain perceives visual stability. Because this percept of visual stability has been shown to be influenced by prior expectations, we tested the hypothesis that it is Bayesian. The key prediction was that priors would be used more as sensory uncertainty increases. Humans and rhesus macaques reported whether an image moved during saccades. We manipulated both prior expectations and levels of sensory uncertainty. All psychophysical data were compared with the predictions of Bayesian ideal observer models. We found that humans were Bayesian for continuous judgments. For categorical judgments, however, they were anti-Bayesian: they used their priors less with greater uncertainty. We studied this categorical result further in macaques. The animals' judgments were similarly anti-Bayesian for sensory uncertainty caused by external, image noise, but Bayesian for uncertainty due to internal, motor-driven noise. A discriminative learning model explained the anti-Bayesian effects. We conclude that active vision uses both Bayesian and discriminative models depending on task requirements (continuous vs. categorical) and the source of uncertainty (image noise vs. motor-driven noise). In the context of previous knowledge about the saccadic system, our results provide an example of how the comparative analysis of Bayesian vs. non-Bayesian models of perception offers novel insights into underlying neural organization.

Significance Statement
Primate vision deals with two major sources of uncertainty: suppression from eye movements and noise in the environment. Fortunately, the brain also has prior knowledge about the body and the world. Systems that exploit such priors more to compensate for greater uncertainty are considered Bayesian. A major theme in neuroscience is that the brain is Bayesian. We tested that hypothesis for vision in the context of eye movements using an integrated computational-psychophysical approach. Bayesian models explained perception during movement-induced noise, but not environmental noise, for which a simpler, "discriminative" model sufficed. We conclude that primate vision is Bayesian to compensate for intrinsic, but not extrinsic, sources of uncertainty, an important distinction for designing and interpreting neural studies of perception.
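The key Bayesian prediction stated in the abstract, that priors are relied on more as sensory uncertainty increases, can be made concrete with the standard Gaussian cue-combination result: the posterior estimate is a reliability-weighted average of the prior mean and the sensory measurement, so the weight on the prior grows with sensory noise. The sketch below is a minimal illustration of that textbook relationship, not the authors' actual ideal observer models; all function and parameter names are illustrative assumptions.

```python
def posterior_mean(measurement, sigma_sensory, prior_mean=0.0, sigma_prior=1.0):
    """Combine a Gaussian prior with a Gaussian measurement.

    Illustrative sketch only; values are not taken from the paper.
    Returns the posterior mean and the weight placed on the prior.
    """
    prior_precision = 1.0 / sigma_prior**2
    sensory_precision = 1.0 / sigma_sensory**2
    w_prior = prior_precision / (prior_precision + sensory_precision)
    estimate = w_prior * prior_mean + (1.0 - w_prior) * measurement
    return estimate, w_prior

# Fixed measurement of image displacement (arbitrary units), with a prior of
# "no displacement" (prior_mean = 0): as sensory noise increases, the weight
# on the prior rises and the estimate shrinks toward zero.
for sigma in (0.5, 1.0, 2.0):
    est, w = posterior_mean(measurement=1.0, sigma_sensory=sigma)
    print(f"sigma_sensory={sigma}: weight on prior={w:.2f}, estimate={est:.2f}")
```

Under this scheme a Bayesian observer's reports are pulled toward the prior more strongly when uncertainty is high; the anti-Bayesian pattern reported for categorical judgments is the opposite, with priors used less as uncertainty grows.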