
Ultralow Power In-Sensor Neuronal Computing with Oscillatory Retinal Neurons for Frequency-Multiplexed, Parallel Machine Vision.

Ragib Ahsan, Hyun Uk Chae, Seyedeh Atiyeh Abbasi Jalal, Zezhi Wu, Jun Tao, Subrata Das, Hefei Liu, Jiang-Bin Wu, Stephen B. Cronin, Han Wang, Constantine Sideris, Rehan Kapadia
Published in: ACS Nano (2024)
In-sensor and near-sensor computing architectures enable multiply-accumulate operations to be carried out directly at the point of sensing. In-sensor architectures offer dramatic power and speed improvements over traditional von Neumann architectures by eliminating multiple analog-to-digital conversions, data storage, and data movement operations. Current in-sensor processing approaches rely on tunable sensors or additional weighting elements to perform linear functions such as multiply-accumulate operations as the sensor acquires data. This work implements in-sensor computing with an oscillatory retinal neuron device that converts incident optical signals into voltage oscillations. A computing scheme is introduced based on the frequency shift of coupled oscillators that enables parallel, frequency-multiplexed, nonlinear operations on the inputs. An experimentally implemented 3 × 3 focal plane array of coupled neurons shows that functions approximating edge detection, thresholding, and segmentation occur in parallel. An example of inference on handwritten digits from the MNIST database is also experimentally demonstrated with a 3 × 3 array of coupled neurons feeding into a single-hidden-layer neural network, approximating a liquid-state machine. Finally, the equivalent energy consumption to carry out image processing operations, including peripherals such as the Fourier transform circuits, is projected to be <20 fJ/OP, possibly reaching as low as 15 aJ/OP.
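The following is a minimal, illustrative sketch (not the authors' implementation) of the frequency-multiplexed readout idea described in the abstract: each retinal neuron is modeled as a phase oscillator whose frequency shifts with incident light intensity, nearest-neighbor coupling mixes the 3 × 3 inputs nonlinearly, and a Fourier transform of a single summed output recovers frequency-domain features in parallel. All parameter values (BASE_FREQ_HZ, FREQ_SHIFT_PER_INTENSITY, COUPLING, sampling rate) are assumptions chosen only to make the sketch run.

```python
import numpy as np

# Assumed, illustrative constants (not taken from the paper).
BASE_FREQ_HZ = 1.0e3               # free-running oscillation frequency
FREQ_SHIFT_PER_INTENSITY = 2.0e2   # frequency shift per unit optical intensity
COUPLING = 0.05                    # coupling strength between neighboring neurons
FS = 1.0e5                         # sampling rate of the simulated voltage trace
DURATION_S = 0.05                  # simulated acquisition window


def simulate_array(intensities: np.ndarray) -> np.ndarray:
    """Simulate a grid of coupled phase (Kuramoto-style) oscillators.

    Each pixel's optical intensity shifts its oscillation frequency, and
    nearest-neighbor coupling performs a collective, nonlinear mixing of the
    inputs. Returns the summed output trace of all neurons.
    """
    n_rows, n_cols = intensities.shape
    n = n_rows * n_cols
    freqs = BASE_FREQ_HZ + FREQ_SHIFT_PER_INTENSITY * intensities.ravel()
    t = np.arange(0, DURATION_S, 1.0 / FS)
    dt = 1.0 / FS

    # Nearest-neighbor adjacency for the grid.
    adj = np.zeros((n, n))
    for r in range(n_rows):
        for c in range(n_cols):
            i = r * n_cols + c
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < n_rows and 0 <= cc < n_cols:
                    adj[i, rr * n_cols + cc] = 1.0

    phases = np.zeros(n)
    summed = np.zeros_like(t)
    for k in range(len(t)):
        # dtheta_i/dt = 2*pi*f_i + K * sum_j A_ij * sin(theta_j - theta_i)
        coupling_term = COUPLING * (
            adj * np.sin(phases[None, :] - phases[:, None])
        ).sum(axis=1)
        phases = phases + dt * (2 * np.pi * freqs + coupling_term)
        summed[k] = np.sum(np.sin(phases))  # single shared output node
    return summed


def frequency_readout(trace: np.ndarray, n_bins: int = 16) -> np.ndarray:
    """FFT-based readout: coarse-grain the spectrum into a small feature vector
    that a downstream classifier (e.g. a single-hidden-layer network) could use."""
    spectrum = np.abs(np.fft.rfft(trace))
    usable = spectrum[: n_bins * (len(spectrum) // n_bins)]
    return usable.reshape(n_bins, -1).mean(axis=1)


if __name__ == "__main__":
    image_patch = np.random.rand(3, 3)   # stand-in for a 3x3 optical input
    trace = simulate_array(image_patch)
    features = frequency_readout(trace)
    print("frequency-domain features:", np.round(features, 2))
```

In a liquid-state-machine-style pipeline, feature vectors like these would be collected for many input patches and fed to a small trained readout layer; only that readout is trained, while the coupled-oscillator "reservoir" stays fixed.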