A sparse quantized Hopfield network for online-continual memory.
Nicholas Alonso, Jeffrey L. Krichmar
Published in: Nature Communications (2024)
An important difference between brains and deep neural networks is the way they learn. Nervous systems learn online, from a stream of noisy data points presented in a non-independent and identically distributed (non-i.i.d.) order, and synaptic plasticity in the brain depends only on information local to each synapse. Deep networks, by contrast, typically use non-local learning algorithms and are trained in an offline, noise-free, i.i.d. setting. Understanding how neural networks learn under the same constraints as the brain is an open problem for neuroscience and neuromorphic computing, and a standard approach to this problem has yet to be established. In this paper, we propose that discrete graphical models trained with an online maximum a posteriori (MAP) learning algorithm could provide such an approach. We implement this kind of model in a neural network we call the Sparse Quantized Hopfield Network (SQHN). We show that our model outperforms state-of-the-art neural networks on associative memory tasks, outperforms these networks in online-continual settings, learns efficiently from noisy inputs, and outperforms baselines on an episodic memory task.
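To make the constraints in the abstract concrete, here is a minimal sketch of a classical binary Hopfield network with online, local Hebbian learning. This is not the paper's SQHN algorithm (which is sparse, quantized, and trained by online MAP inference); it is the textbook baseline such models build on, and the class and function names below are illustrative only. It shows what "online" (patterns stored one at a time) and "local" (each weight update depends only on its two endpoint units) mean in a Hopfield-style associative memory.

```python
import numpy as np


class BinaryHopfield:
    """Classical binary Hopfield network with online, local Hebbian learning.

    For intuition only; the paper's Sparse Quantized Hopfield Network (SQHN)
    uses a different architecture and an online MAP learning rule.
    """

    def __init__(self, n_units: int):
        self.n = n_units
        # Symmetric weight matrix, no self-connections.
        self.W = np.zeros((n_units, n_units))

    def store(self, pattern: np.ndarray) -> None:
        """Online Hebbian update: each weight change depends only on the
        activities of its two endpoint units, and patterns arrive one at a time."""
        x = pattern.reshape(-1, 1)        # column vector of +/-1 values
        self.W += (x @ x.T) / self.n
        np.fill_diagonal(self.W, 0.0)

    def recall(self, probe: np.ndarray, n_steps: int = 20) -> np.ndarray:
        """Iteratively denoise a corrupted probe by descending the network energy."""
        x = probe.astype(float).copy()
        for _ in range(n_steps):
            x = np.sign(self.W @ x)
            x[x == 0] = 1.0               # break ties toward +1
        return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    net = BinaryHopfield(n_units=100)
    pattern = rng.choice([-1, 1], size=100)
    net.store(pattern)                     # one-shot, online storage
    noisy = pattern.copy()
    flip = rng.choice(100, size=10, replace=False)
    noisy[flip] *= -1                      # corrupt 10% of the bits
    recovered = net.recall(noisy)
    print("bits recovered:", int((recovered == pattern).sum()), "/ 100")
```

Even this baseline recovers the stored pattern from a corrupted probe, but its capacity degrades badly in continual, non-i.i.d. settings; the abstract's claim is that the SQHN's sparse, quantized representation and online MAP updates are what make online-continual memory work well.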