Teacher-student guided knowledge distillation for unsupervised convolutional neural network-based speckle tracking in ultrasound strain elastography.
Tianqiang Xiang, Yan Li, Hui Deng, Chao Tian, Bo Peng, Jingfeng Jiang
Published in: Medical & biological engineering & computing (2024)
Accurate and efficient motion estimation is a crucial component of real-time ultrasound elastography (USE). However, obtaining radiofrequency (RF) ultrasound data in clinical practice can be challenging. B-mode (BM) data, in contrast, is readily available, but elastograms derived from BM data are sub-optimal. Furthermore, many conventional ultrasound devices (e.g., portable devices) do not provide an elastography mode, which remains a significant obstacle to the widespread adoption of USE. To address these challenges, we developed a teacher-student guided knowledge distillation framework for an unsupervised convolutional neural network (TSGUPWC-Net) that improves the accuracy of BM-based motion estimation. The framework builds on a well-established convolutional neural network (CNN), the modified pyramid warping and cost volume network (MPWC-Net). A teacher model pre-trained on RF data guides the training of a student model that operates on BM data, and spatial attention transfer is applied at intermediate layers to strengthen this guidance. The loss function combines a displacement-field smoothness term, a knowledge distillation term, and an intermediate-layer (attention transfer) term. We evaluated our method on simulated data, phantoms, and in vivo ultrasound data. The results show that our method achieves higher signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) in axial strain estimation than a model trained on BM data alone. The model is unsupervised and requires no ground-truth labels during training, making it highly promising for motion estimation applications.
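The three-term loss described above can be illustrated with a minimal NumPy sketch. The exact weights, normalizations, and the paper's precise formulation are not given in the abstract, so the function names (`smoothness_loss`, `distillation_loss`, `attention_transfer_loss`), the L2 forms, and the weighting scheme below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def smoothness_loss(disp):
    # disp: (H, W, 2) displacement field; penalize spatial gradients
    # (assumed first-order finite-difference smoothness regularizer)
    dy = np.diff(disp, axis=0)
    dx = np.diff(disp, axis=1)
    return np.mean(dy ** 2) + np.mean(dx ** 2)

def distillation_loss(student_disp, teacher_disp):
    # Knowledge distillation term: penalize disagreement between the
    # student's (BM-based) and the frozen teacher's (RF-based) estimates.
    return np.mean((student_disp - teacher_disp) ** 2)

def attention_transfer_loss(student_feat, teacher_feat):
    # Spatial attention transfer at an intermediate layer: compare
    # channel-pooled, L2-normalized activation maps (feat: (C, H, W)).
    def attention_map(feat):
        a = np.sum(feat ** 2, axis=0).ravel()
        return a / (np.linalg.norm(a) + 1e-8)
    return np.linalg.norm(attention_map(student_feat)
                          - attention_map(teacher_feat))

def total_loss(student_disp, teacher_disp, student_feats, teacher_feats,
               w_smooth=1.0, w_kd=1.0, w_at=1.0):
    # Weighted sum of the three terms; the weights are placeholders.
    at_term = sum(attention_transfer_loss(s, t)
                  for s, t in zip(student_feats, teacher_feats))
    return (w_smooth * smoothness_loss(student_disp)
            + w_kd * distillation_loss(student_disp, teacher_disp)
            + w_at * at_term)
```

Because the teacher is pre-trained and the loss references only the teacher's outputs and the displacement field's smoothness, no ground-truth displacement labels are needed, which is what makes the student's training unsupervised.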