Intelligent Intraoperative Haptic-AR Navigation for COVID-19 Lung Biopsy Using Deep Hybrid Model.

Yonghang Tai, Kai Qian, Xiaoqiao Huang, Jun Zhang, Mian Ahmad Jan, Zhengtao Yu
Published in: IEEE Transactions on Industrial Informatics (2021)
A novel intelligent navigation technique for accurate image-guided COVID-19 lung biopsy is presented, which systematically combines augmented reality (AR), customized haptic-enabled surgical tools, and a deep neural network to achieve patient-specific surgical navigation. Clinical data from 341 COVID-19-positive patients and a 1598-patient negative control group were collected for model training and evaluation. Biomechanical force data from the experiments were fed into a WPD-CNN-LSTM (WCL) network to learn a new patient-specific COVID-19 surgical model, and a ResNet was employed for intraoperative force classification. To boost immersion and improve the user experience, intraoperative guiding images were combined with the haptic-AR navigational view. Furthermore, a 3-D user interface (3DUI) containing all requisite surgical details was developed with guaranteed real-time response. Twenty-four thoracic surgeons were invited to objective and subjective experiments for performance evaluation. The root-mean-square error of the proposed WCL model is 0.0128, and its classification accuracy is 97%, demonstrating that the proposed AR with deep learning (DL) intelligent model significantly outperforms existing perception-based navigation techniques. This article presents a novel framework for interventional surgical integration for COVID-19 and opens new research on the integration of AR, haptic rendering, and deep learning for surgical navigation.
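The WCL pipeline begins with wavelet packet decomposition (WPD) of the biomechanical force signal before the CNN and LSTM stages. As an illustration only — the abstract does not specify the wavelet basis or decomposition depth — the following is a minimal Haar-based WPD sketch; the `wpd` helper and the synthetic force signal are hypothetical stand-ins for the paper's preprocessing:

```python
import numpy as np

def haar_step(x):
    # One level of the orthonormal Haar transform: pairwise averages
    # (approximation band) and differences (detail band), each half
    # the length of the input.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wpd(x, levels):
    # Wavelet packet decomposition: unlike the plain DWT, *both* the
    # approximation and detail bands are split again at every level,
    # yielding 2**levels equal-width sub-bands.
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for band in nodes:
            a, d = haar_step(band)
            nxt.extend([a, d])
        nodes = nxt
    return nodes

# Hypothetical force signal; its length must be divisible by 2**levels.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.standard_normal(64)
bands = wpd(signal, levels=3)
print(len(bands), bands[0].shape)  # 8 sub-bands, each of length 8
```

In a WCL-style model, the per-band coefficients (or their energies) would then be stacked as channels and passed to the CNN for local feature extraction, with an LSTM modeling the temporal evolution of the tool-tissue force.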