Noise-mitigation strategies in physical feedforward neural networks.
Nadezhda Semenova, Daniel Brunner. Published in: Chaos (Woodbury, N.Y.) (2022)
Physical neural networks are promising candidates for next-generation artificial intelligence hardware. In such architectures, neurons and connections are physically realized and do not leverage digital concepts with their practically infinite signal-to-noise ratio to encode, transduce, and transform information. They are therefore prone to noise with a variety of statistical and architectural properties, and effective strategies leveraging network-inherent assets to mitigate noise in a hardware-efficient manner are important in the pursuit of next-generation neural network hardware. Based on analytical derivations, we here introduce and analyze a variety of noise-mitigation approaches. We analytically show that intra-layer connections in which the connection matrix's squared mean exceeds the mean of its square fully suppress uncorrelated noise. Going further, we develop two synergistic strategies for noise that is uncorrelated and correlated across populations of neurons. First, we introduce the concept of ghost neurons: each group of neurons perturbed by correlated noise has a negative connection to a single neuron that does not receive any input information. Second, we show that pooling of neuron populations is an efficient approach to suppress uncorrelated noise. Combining these, we develop a general noise-mitigation strategy that leverages the statistical properties of the noise terms most relevant in analog hardware. Finally, we demonstrate the effectiveness of this combined approach for a trained neural network classifying the Modified National Institute of Standards and Technology (MNIST) handwritten digits, for which we achieve a fourfold improvement of the output signal-to-noise ratio. Our noise mitigation lifts the classification accuracy of the noisy neural network from 92.07% to 97.49%, essentially identical to the 97.54% of the noise-free network.
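The two synergistic strategies described above can be made concrete with a short simulation. The sketch below is our own illustration, not the authors' code: the population size, noise amplitudes, variable names, and the SNR definition are assumptions chosen for readability, and the construction of the ghost neuron follows our reading of the abstract's description (a neuron exposed to the population's correlated noise but carrying no input signal, subtracted via a negative connection).

```python
# Illustrative NumPy sketch (assumed parameters, not the authors' code):
# a "ghost neuron" cancels noise that is correlated across a population,
# and pooling (averaging) suppresses the remaining uncorrelated noise.
import numpy as np

rng = np.random.default_rng(seed=0)

n_neurons = 100     # neurons in one noise-correlated population (assumed)
n_trials = 10_000   # repeated readouts used to estimate the SNR (assumed)

signal = 1.0                                               # noise-free activation
corr = rng.normal(0.0, 0.5, size=(n_trials, 1))            # shared per trial
uncorr = rng.normal(0.0, 0.2, size=(n_trials, n_neurons))  # independent per neuron
noisy = signal + corr + uncorr                             # measured activations

# Ghost neuron: exposed to the same correlated noise but receiving no input
# signal; the negative connection subtracts the correlated term, at the cost
# of injecting the ghost's own uncorrelated noise.
ghost = corr[:, 0] + rng.normal(0.0, 0.2, size=n_trials)
deghosted = noisy - ghost[:, None]

# Pooling: averaging the population shrinks the per-neuron uncorrelated
# noise variance roughly by a factor 1/n_neurons.
pooled = deghosted.mean(axis=1)

def snr(x):
    """Mean output divided by its standard deviation across trials."""
    return np.mean(x) / np.std(x)

print(f"single noisy neuron SNR: {snr(noisy[:, 0]):.2f}")
print(f"ghost + pooling SNR:     {snr(pooled):.2f}")
```

In this toy setting, the residual noise floor after both steps is set by the ghost neuron's own uncorrelated noise, which illustrates why the two strategies are synergistic rather than redundant: the ghost neuron handles the correlated term that averaging cannot remove, while pooling handles the uncorrelated term that a single subtraction cannot.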