
A Combined mmWave Tracking and Classification Framework Using a Camera for Labeling and Supervised Learning.

Andre Pearce, Jian Andrew Zhang, Richard Yi Da Xu
Published in: Sensors (Basel, Switzerland) (2022)
Millimeter wave (mmWave) radar offers promising opportunities for multiple-object tracking and sensing as a unified system. One of the most challenging aspects of exploiting these sensing opportunities is labeling mmWave data so that a model can be designed to achieve the desired tracking and sensing goals. Labeling mmWave datasets usually involves a domain expert manually associating radar frames with key events of interest, which is a laborious process. This paper presents a framework that uses a camera to label mmWave data and supervise the training of a radar model. The proposed methodology is compared and assessed against existing frameworks that aim to achieve a similar goal, and its practicality is demonstrated through experimentation in varying environmental conditions. The framework is applied to design a mmWave multi-object tracking system that can additionally classify individual human motion patterns, such as running, walking, and falling. The experimental findings demonstrate that a radar model trained with camera-based labeling and supervision consistently achieves high classification accuracy in environments beyond those in which it was trained. The research presented in this paper provides a foundation for future work on unified tracking and sensing systems by alleviating the labeling and training challenges associated with designing a mmWave classification model.
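To make the camera-supervised training idea concrete, the sketch below shows one possible (hypothetical) realization of such a pipeline: per-frame radar feature vectors are paired with activity labels produced by a camera-based detector on time-synchronized video, and a standard classifier is trained on radar features alone. The variable names `radar_features` and `camera_labels`, the random placeholder data, and the choice of a random-forest classifier are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only; not the paper's published code.
# Assumes hypothetical, time-synchronized inputs:
#   radar_features: per-frame mmWave feature vectors (e.g., micro-Doppler stats)
#   camera_labels:  activity labels ("walking", "running", "falling") emitted
#                   by a camera-based activity detector for the same frames.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_frames, n_features = 1000, 32

# Placeholder data standing in for a real synchronized radar/camera recording.
radar_features = rng.normal(size=(n_frames, n_features))
camera_labels = rng.choice(["walking", "running", "falling"], size=n_frames)

# The camera only supplies labels during training; at inference time the
# classifier operates on radar features alone.
X_train, X_test, y_train, y_test = train_test_split(
    radar_features, camera_labels, test_size=0.2, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In this kind of setup, generalization across environments (the property the paper evaluates) would be assessed by holding out recordings from rooms or scenes not seen during training rather than by a random frame-level split.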