
Synaptic plasticity as Bayesian inference.

Laurence Aitchison, Jannes Jegminat, Jorge Aurelio Menendez, Jean-Pascal Pfister, Alexandre Pouget, Peter E. Latham
Published in: Nature Neuroscience (2021)
Learning, especially rapid learning, is critical for survival. However, learning is hard: a large number of synaptic weights must be set based on noisy, often ambiguous, sensory information. In such a high-noise regime, keeping track of probability distributions over weights is the optimal strategy. Here we hypothesize that synapses adopt that strategy; in essence, when they estimate weights, they include error bars. They then use that uncertainty to adjust their learning rates, with more uncertain weights having higher learning rates. We also make a second, independent hypothesis: synapses communicate their uncertainty by linking it to variability in postsynaptic potential size, with more uncertainty leading to more variability. These two hypotheses cast synaptic plasticity as a problem of Bayesian inference, and thus provide a normative view of learning. They generalize known learning rules, offer an explanation for the large variability in the size of postsynaptic potentials, and make falsifiable experimental predictions.
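To make the two hypotheses concrete, here is a minimal NumPy sketch of one possible reading of them: each synapse keeps a Gaussian posterior (mean and variance) over its weight, the per-synapse learning rate scales with that variance via a Kalman-filter-style update, and transmitted weights are sampled from the posterior so that uncertainty shows up as postsynaptic-potential variability. Everything below (the linear neuron, the supervised target, the noise level, the diffusion term, all names) is an illustrative assumption, not the paper's actual model or equations.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5                              # synapses onto one toy neuron
w_true = rng.normal(0.0, 1.0, n)   # ground-truth weights to be learned
obs_noise = 0.5                    # assumed std of noise on the target

mu = np.zeros(n)                   # posterior mean per weight
sigma2 = np.ones(n)                # posterior variance per weight ("error bars")

for step in range(5000):
    x = rng.normal(0.0, 1.0, n)                   # presynaptic activity
    y = w_true @ x + obs_noise * rng.normal()     # noisy supervised target

    # Hypothesis 2 (illustration only): sample weights from the posterior,
    # so more uncertain synapses produce more variable PSP contributions.
    w_sample = mu + np.sqrt(sigma2) * rng.normal(size=n)
    psp = w_sample * x                            # per-synapse PSPs this trial

    # Hypothesis 1: Kalman-style update with a diagonal (per-synapse)
    # approximation -- each synapse's gain (learning rate) scales with
    # its own posterior variance sigma2.
    err = y - mu @ x                              # prediction error
    s = obs_noise**2 + (sigma2 * x**2).sum()      # predictive variance of y
    gain = sigma2 * x / s
    mu = mu + gain * err                          # uncertain weights move more
    sigma2 = sigma2 * (1.0 - gain * x)            # data shrinks the error bars
    sigma2 = sigma2 + 1e-4                        # slow drift keeps learning on

print("true weights   :", np.round(w_true, 2))
print("posterior means:", np.round(mu, 2))
print("posterior stds :", np.round(np.sqrt(sigma2), 2))
```

In this sketch the gain term is what links the two quantities the abstract emphasizes: a synapse with a large variance takes a large step on each error, and its variance (hence its sampled-PSP variability) shrinks as evidence accumulates.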