The study of plasticity has always been about gradients.

Blake Aaron Richards, Konrad Paul Kording
Published in: The Journal of Physiology (2023)
The experimental study of learning and plasticity has always been driven by an implicit question: how can physiological changes be adaptive and improve performance? For example, in Hebbian plasticity only synapses from presynaptic neurons that were active are changed, avoiding useless changes. Similarly, in dopamine-gated learning, synaptic changes depend on reward or its absence, and synapses do not change when everything is predictable. Within machine learning we can make the question of which changes are adaptive concrete: performance improves when changes correlate with the gradient of an objective function quantifying performance. This holds for any system that improves through small changes. As such, physiology has always implicitly been seeking mechanisms that allow the brain to approximate gradients. From this perspective we review the existing literature on plasticity-related mechanisms, and we show how these mechanisms relate to gradient estimation. We argue that gradients are a unifying idea for explaining the many facets of neuronal plasticity.
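The central claim, that performance improves whenever weight changes correlate with the gradient of an objective, can be illustrated with a toy simulation. The sketch below (hypothetical, not code from the paper) trains a single linear "neuron" two ways on the same squared-error loss: once with the exact gradient, and once with a weight-perturbation rule that estimates the gradient purely from observed changes in performance, the kind of gradient approximation the authors argue biology must implement in some form.

```python
import random

random.seed(0)

# One training example and a squared-error objective L(w) = (w*x - y)^2.
x, y = 1.0, 2.0

def loss(w):
    return (w * x - y) ** 2

def grad(w):
    # Exact gradient dL/dw, available to an engineer but not to a synapse.
    return 2 * (w * x - y) * x

# 1) Exact gradient descent.
w = 0.0
for _ in range(100):
    w -= 0.1 * grad(w)

# 2) Weight perturbation: apply a small random change, observe how the
# loss responds, and scale the perturbation by that response. On average
# this update correlates with the true gradient, so performance improves
# even though no gradient was ever computed explicitly.
v = 0.0
sigma, lr = 0.01, 0.05
for _ in range(2000):
    eps = random.gauss(0.0, sigma)
    g_est = (loss(v + eps) - loss(v)) * eps / sigma**2  # noisy gradient estimate
    v -= lr * g_est

print(loss(w), loss(v))  # both losses shrink toward zero
```

The perturbation-based learner needs many more trials because its gradient estimate is noisy, but the direction of improvement is the same: any update rule whose expected change correlates with the negative gradient will, on average, reduce the loss.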