Restricted Boltzmann Machines Implemented by Spin-Orbit Torque Magnetic Tunnel Junctions.
Xiaohan Li, Caihua Wan, Ran Zhang, Mingkun Zhao, Shilong Xiong, Dehao Kong, Xuming Luo, Bin He, Shiqiang Liu, Jihao Xia, Guoqiang Yu, Xiufeng Han. Published in: Nano Letters (2024)
Artificial intelligence has surged forward with the advent of generative models, which rely heavily on stochastic computing architectures enhanced by true random number generators with adjustable sampling probabilities. In this study, we develop spin-orbit torque magnetic tunnel junctions (SOT-MTJs) and investigate their sigmoid-style switching probability as a function of the driving voltage. This feature proves to be ideally suited for stochastic computing algorithms such as the restricted Boltzmann machine (RBM), which is prevalent in pretraining processes. We exploit SOT-MTJs as both stochastic samplers and network nodes for RBMs, enabling the implementation of RBM-based neural networks that recognize both handwritten and spoken digits. Moreover, we harness the weights derived from the preceding image and speech training processes to facilitate cross-modal learning from speech to image generation. Our results demonstrate that these SOT-MTJs are promising candidates for hardware accelerators tailored to Boltzmann neural networks and other stochastic computing architectures.
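To make the computational role of the device concrete, the following is a minimal software sketch of an RBM trained with one-step contrastive divergence, in which the Bernoulli sampling step stands in for an SOT-MTJ whose switching probability follows a sigmoid of the driving voltage. All names (`mtj_sample`, layer sizes, learning rate) are illustrative assumptions, not the authors' implementation; a pseudo-random generator emulates the true randomness the hardware would provide.

```python
# Minimal sketch, assuming a NumPy RBM with binary units; the Bernoulli draw
# emulates the stochastic switching of an SOT-MTJ biased at a given voltage.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Sigmoid-style switching probability vs. drive: p = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def mtj_sample(p):
    # Hardware analogue (hypothetical): each SOT-MTJ switches to "1" with
    # probability p; here a pseudo-random Bernoulli draw stands in for it.
    return (rng.random(p.shape) < p).astype(np.float64)

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)

    def sample_h(self, v):
        # Hidden-unit activation probabilities and stochastic samples.
        p_h = sigmoid(v @ self.W + self.b_h)
        return p_h, mtj_sample(p_h)

    def sample_v(self, h):
        # Visible-unit reconstruction probabilities and stochastic samples.
        p_v = sigmoid(h @ self.W.T + self.b_v)
        return p_v, mtj_sample(p_v)

    def contrastive_divergence(self, v0, lr=0.1):
        # CD-1: one Gibbs step; update weights from data vs. reconstruction statistics.
        p_h0, h0 = self.sample_h(v0)
        p_v1, v1 = self.sample_v(h0)
        p_h1, _ = self.sample_h(v1)
        self.W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / v0.shape[0]
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Toy usage: a 784-visible RBM (e.g., 28x28 binarized digit images) trained
# on a random binary batch purely to exercise the sampling loop.
rbm = RBM(n_visible=784, n_hidden=128)
batch = (rng.random((32, 784)) < 0.5).astype(np.float64)
for _ in range(10):
    rbm.contrastive_divergence(batch)
```

In a hardware realization along the lines described in the abstract, each call to `mtj_sample` would correspond to reading the stochastic switching outcome of a junction driven at a voltage set by the sigmoid argument, so the random-number generation and the nonlinearity are both provided by the device physics.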
Keyphrases
- neural network
- artificial intelligence
- deep learning
- machine learning