A Systematic Study on Electromyography-Based Hand Gesture Recognition for Assistive Robots Using Deep Learning and Machine Learning Models.
Pranesh Gopal, Amandine Gesta, Abolfazl Mohebbi. Published in: Sensors (Basel, Switzerland) (2022)
Upper limb amputation severely affects the quality of life and the activities of daily living of a person. In the last decade, many robotic hand prostheses have been developed that are controlled using various sensing technologies, such as artificial vision, tactile sensing, and surface electromyography (sEMG). If controlled properly, these prostheses can significantly improve the daily life of hand amputees by providing them with more autonomy in physical activities. However, despite the advancements in sensing technologies, as well as the excellent mechanical capabilities of the prosthetic devices, their control is often limited and usually requires a long time for training and adaptation of the users. Myoelectric prostheses use signals from residual stump muscles to restore the function of the lost limb seamlessly. However, using sEMG signals as a control input in robotics is complicated by the presence of noise and the heavy computational power required. In this article, we developed motion intention classifiers for transradial (TR) amputees based on EMG data by implementing various machine learning and deep learning models. We benchmarked the performance of these classifiers based on overall generalization across various classes, and we present a systematic study on the impact of time-domain features and pre-processing parameters on the performance of the classification models. Our results showed that ensemble learning and deep learning algorithms outperformed other classical machine learning algorithms. Investigating the effect of varying the sliding window length on feature-based and non-feature-based classification models revealed an interesting correlation with the level of amputation. The study also analyzed classifier performance across amputation conditions, since the amputation history and conditions differ for each amputee.
These results are vital for understanding the development of machine learning-based classifiers for assistive robotic applications.
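The abstract refers to time-domain features computed over sliding windows of the sEMG signal. The specific features and window parameters used in the paper are not stated here, but a minimal sketch of this standard pipeline, assuming four classic time-domain features (mean absolute value, root mean square, waveform length, zero crossings) and placeholder window/step sizes, might look like:

```python
import numpy as np

def extract_td_features(emg, window_size=200, step=50, zc_thresh=0.01):
    """Slide a window over a 1-D sEMG signal and compute four common
    time-domain features per window:
      MAV - mean absolute value
      RMS - root mean square
      WL  - waveform length (sum of absolute first differences)
      ZC  - zero crossings above a noise threshold
    Returns an (n_windows, 4) feature matrix for a downstream classifier.
    NOTE: feature choice, window_size, step, and zc_thresh are assumptions,
    not values taken from the paper."""
    features = []
    for start in range(0, len(emg) - window_size + 1, step):
        w = emg[start:start + window_size]
        mav = np.mean(np.abs(w))
        rms = np.sqrt(np.mean(w ** 2))
        wl = np.sum(np.abs(np.diff(w)))
        # count sign changes whose amplitude jump exceeds the noise threshold
        zc = np.sum((w[:-1] * w[1:] < 0) &
                    (np.abs(w[:-1] - w[1:]) > zc_thresh))
        features.append([mav, rms, wl, zc])
    return np.asarray(features)

# Example on a synthetic signal standing in for one sEMG channel
signal = np.sin(np.linspace(0, 20 * np.pi, 1000))
F = extract_td_features(signal, window_size=200, step=100)
print(F.shape)  # one row of 4 features per window
```

Shortening the window reduces latency but gives each feature fewer samples to average over, which is one way the sliding-window length can interact with signal quality and, as the study reports, with the level of amputation.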