Fundamental aspects of noise in analog-hardware neural networks.
Nadezhda Semenova, X. Porte, L. Andreoli, Maxime Jacquot, Laurent Larger, Daniel Brunner
Published in: Chaos (Woodbury, N.Y.) (2019)
We study and analyze fundamental aspects of noise propagation in recurrent as well as deep, multilayer networks. Our study is motivated by neural networks in analog hardware; yet, the methodology provides insight into networks in general. Considering noisy linear nodes, we investigate the signal-to-noise ratio at the network's outputs, which determines the upper limit of computational precision. We consider additive and multiplicative noise, which can be purely local as well as correlated across populations of neurons. This covers the chief internal perturbations of hardware networks; the noise amplitudes were obtained from a physically implemented neural network. Our analytically derived descriptions agree exceptionally well with numerical data, enabling a clear identification of the components critical for the management and mitigation of noise. We find that analog neural networks are surprisingly robust, in particular against noisy neurons: uncorrelated perturbations are almost fully suppressed, while correlated noise can accumulate. Our work identifies the notoriously sensitive points of such computational systems while highlighting their surprising robustness.
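The contrast between uncorrelated and correlated noise can be illustrated with a short simulation. The sketch below is not the authors' code; the additive noise model, the layer size N, and the uniform averaging readout are illustrative assumptions. It estimates the output signal-to-noise ratio of a layer of noisy linear nodes whose outputs are averaged: independent per-node noise is suppressed by the averaging, whereas noise shared across the population is not.

```python
# Minimal sketch (illustrative assumptions, not the paper's model):
# each node receives a constant signal plus additive Gaussian noise,
# and the network output is the uniform average over all N nodes.
import numpy as np

rng = np.random.default_rng(0)

N = 1000       # number of nodes in the layer (hypothetical size)
T = 20000      # number of samples used to estimate the SNR
signal = 1.0   # constant input signal to every node
sigma = 0.1    # noise standard deviation

def output_snr(correlated: bool) -> float:
    """SNR (mean^2 / variance) of the averaged layer output."""
    if correlated:
        # One shared noise realization per time step, common to all nodes.
        noise = rng.normal(0.0, sigma, size=(T, 1)) * np.ones((1, N))
    else:
        # Independent noise realization for every node and time step.
        noise = rng.normal(0.0, sigma, size=(T, N))
    node_outputs = signal + noise    # noisy linear nodes
    y = node_outputs.mean(axis=1)    # network output: average over nodes
    return y.mean() ** 2 / y.var()

print(f"uncorrelated noise: SNR = {output_snr(False):.0f}")
print(f"correlated noise:   SNR = {output_snr(True):.0f}")
```

Under these assumptions, the uncorrelated case yields an SNR roughly N times larger than the correlated case (output noise variance sigma^2/N versus sigma^2), consistent with the abstract's claim that uncorrelated perturbations are almost fully suppressed while correlated noise persists.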