Highly Reliable 3D Channel Memory and Its Application in a Neuromorphic Sensory System for Hand Gesture Recognition.
Dohyung Kim, Cheong Beom Lee, Kyu Kwan Park, Hyeonsu Bang, Phuoc Loc Truong, Jongmin Lee, Bum Ho Jeong, Hakjun Kim, Sang Min Won, Do Hwan Kim, Daeho Lee, Jong Hwan Ko, Hyoung Won Baac, Kyeounghak Kim, Hui Joon Park. Published in: ACS Nano (2023)
Brain-inspired neuromorphic computing systems, based on crossbar arrays of two-terminal multilevel resistive random-access memory (RRAM), have attracted attention as promising technologies for processing large amounts of unstructured data. However, the low reliability and inferior conductance tunability of RRAM, caused by uncontrollable metal-filament formation in the uneven switching medium, result in lower accuracy than that of a software neural network (SW-NN). In this work, we present a highly reliable CoOx-based multilevel RRAM with an optimized crystal size and density in the switching medium, providing a three-dimensional (3D) grain-boundary (GB) network. This design enhances the reliability of the RRAM by improving the cycle-to-cycle endurance and device-to-device stability of the I-V characteristics with minimal variation. Furthermore, the designed 3D GB-channel RRAM (3D GB-RRAM) exhibits excellent conductance tunability, demonstrating high symmetricity (624), low nonlinearity (β_LTP/β_LTD ≈ 0.20/0.39), and a large dynamic range (G_max/G_min ≈ 31.1). The cyclic stability of long-term potentiation and depression also exceeds 100 cycles (10^5 voltage pulses), and the relative standard deviation of G_max/G_min is only 2.9%. Leveraging these reliability and performance attributes, we propose a neuromorphic sensory system for finger-motion tracking and hand-gesture recognition as a potential elemental technology for the metaverse. The system consists of a stretchable double-layered photoacoustic strain sensor and a crossbar-array neural network. We perform training and recognition tasks on ultrasonic patterns associated with finger motions and hand gestures, attaining recognition accuracies of 97.9% and 97.4%, comparable to those of the SW-NN (99.8% and 98.7%).
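The nonlinearity figures of merit quoted above (β_LTP/β_LTD) are conventionally extracted by fitting pulse-by-pulse conductance updates to an exponential saturation model; a small β means the potentiation or depression branch is nearly linear, which is what makes weight updates in a crossbar network well behaved. Below is a minimal sketch of that widely used model, written here as an illustration, not the paper's own fitting code; the function name and parameter values are illustrative, loosely echoing the abstract's ~31x dynamic range.

```python
import numpy as np

def ltp_ltd_curves(g_min, g_max, beta_ltp, beta_ltd, n_pulses):
    """Model potentiation/depression conductance updates with the common
    exponential nonlinearity model: G rises from g_min toward g_max over
    n_pulses identical potentiation pulses, then decays back under
    depression pulses. beta controls how strongly each branch saturates
    (beta -> 0 gives a perfectly linear update)."""
    p = np.arange(n_pulses + 1) / n_pulses  # normalized pulse index, 0..1
    # LTP branch: G(p) = Gmin + (Gmax - Gmin) * (1 - e^{-beta p}) / (1 - e^{-beta})
    g_ltp = g_min + (g_max - g_min) * (1 - np.exp(-beta_ltp * p)) / (1 - np.exp(-beta_ltp))
    # LTD branch: mirror-image decay from Gmax back toward Gmin
    g_ltd = g_max - (g_max - g_min) * (1 - np.exp(-beta_ltd * p)) / (1 - np.exp(-beta_ltd))
    return g_ltp, g_ltd

# Illustrative values only: beta taken from the abstract, G range scaled so
# that Gmax/Gmin ~ 31.1 as reported.
g_ltp, g_ltd = ltp_ltd_curves(g_min=1.0, g_max=31.1,
                              beta_ltp=0.20, beta_ltd=0.39, n_pulses=500)
print(g_ltp[0], g_ltp[-1])  # LTP sweeps from Gmin up to Gmax
print(g_ltd[0], g_ltd[-1])  # LTD sweeps from Gmax down to Gmin
```

With β values this small, both branches are close to straight lines, which is the sense in which the abstract's 0.20/0.39 figures indicate low nonlinearity.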