Optimal Encoding in Stochastic Latent-Variable Models

Michael Everett Rule, Martino Sorbaro, Matthias H. Hennig
Published in: Entropy (Basel, Switzerland) (2020)
In this work we explore encoding strategies learned by statistical models of sensory coding in noisy spiking networks. Early stages of sensory communication in neural systems can be viewed as encoding channels in the information-theoretic sense. However, neural populations face constraints not commonly considered in communications theory. Using restricted Boltzmann machines as a model of sensory encoding, we find that networks with sufficient capacity learn to balance precision and noise-robustness in order to adaptively communicate stimuli with varying information content. Mirroring variability suppression observed in sensory systems, informative stimuli are encoded with high precision, at the cost of more variable responses to frequent, hence less informative stimuli. Curiously, we also find that statistical criticality in the neural population code emerges at model sizes where the input statistics are well captured. These phenomena have well-defined thermodynamic interpretations, and we discuss their connection to prevailing theories of coding and statistical criticality in neural populations.
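For concreteness, the sketch below shows the kind of model the abstract refers to: a Bernoulli restricted Boltzmann machine trained with one-step contrastive divergence (CD-1) on binarized population activity. This is an illustrative assumption, not the authors' code; the synthetic "spike words", layer sizes, and learning rate are placeholders standing in for recorded or simulated sensory responses.

```python
# Minimal Bernoulli RBM with CD-1 training on binarized spike patterns.
# Illustrative sketch only; data and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BernoulliRBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible (spiking-unit) biases
        self.b_h = np.zeros(n_hidden)    # hidden (latent-variable) biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step down to the visible layer and back up.
        pv1 = self.visible_probs(h0)
        v1 = (rng.random(pv1.shape) < pv1).astype(float)
        ph1 = self.hidden_probs(v1)
        # CD-1 gradient approximation and parameter update.
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)

# Synthetic binarized spike words: 20 units correlated through a shared drive.
drive = rng.random((5000, 1))
data = (rng.random((5000, 20)) < 0.2 + 0.5 * drive).astype(float)

rbm = BernoulliRBM(n_visible=20, n_hidden=10)
for epoch in range(20):
    for batch in np.array_split(data, 50):
        rbm.cd1_step(batch)
```

Varying `n_hidden` in a sketch like this is one way to probe the capacity dependence discussed in the abstract, where richer latent layers can better match the input statistics.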