M³S-Net: multi-modality multi-branch multi-self-attention network with structure-promoting loss for low-dose PET/CT enhancement.

Dong Wang, Chong Jiang, Jian He, Yue Teng, Hourong Qin, Jijun Liu, Xiaoping Yang
Published in: Physics in Medicine and Biology (2023)
Objective: PET inherently involves radiotracer injection and long scanning times, which raises concerns about radiation exposure risk and patient comfort. Reducing the radiotracer dose and the acquisition time can lower the potential risk and improve patient comfort, respectively, but both also reduce photon counts and hence degrade image quality. It is therefore of interest to improve the quality of low-dose PET images.

Approach: A supervised multi-modality deep learning model, named M³S-Net, was proposed to reconstruct standard-dose PET images (60 seconds per bed position) from low-dose ones (10 seconds per bed position) and the corresponding CT images. Specifically, we designed a multi-branch convolutional neural network with multi-self-attention mechanisms, which first extracts features from the PET and CT images in two separate branches and then fuses the features to generate the final reconstructed PET images. Moreover, a novel multi-modality structure-promoting term was proposed in the loss function to learn the anatomical information contained in the CT images.

Main results: We conducted extensive numerical experiments on real clinical data collected from local hospitals. Compared with state-of-the-art methods, the proposed M³S-Net not only achieved higher objective metrics and better-reconstructed tumors, but also performed better in preserving edges and suppressing noise and artifacts.

Significance: The quantitative metrics and qualitative displays demonstrate that the proposed M³S-Net can generate high-quality PET images from low-dose ones that are comparable to standard-dose PET images. This is valuable in reducing PET acquisition time and has potential applications in dynamic PET imaging.
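To make the described approach concrete, the sketch below illustrates the general idea of a two-branch PET/CT network with self-attention fusion and a loss that couples PET structure to CT structure. This is a minimal PyTorch-style sketch, not the authors' released code: the module design, channel sizes, and the exact form of the structure-promoting term (here, matching PET image gradients to CT image gradients) are assumptions for illustration only.

```python
# Hypothetical sketch (not the authors' implementation): two-branch encoder with
# self-attention fusion and a gradient-based structure-promoting loss term.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention2d(nn.Module):
    """Simple non-local-style self-attention over spatial positions."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TwoBranchPETCTNet(nn.Module):
    """Extract PET and CT features in separate branches, then fuse them."""
    def __init__(self, feat=32):
        super().__init__()
        self.pet_branch = nn.Sequential(conv_block(1, feat), SelfAttention2d(feat))
        self.ct_branch = nn.Sequential(conv_block(1, feat), SelfAttention2d(feat))
        self.fusion = nn.Sequential(conv_block(2 * feat, feat), SelfAttention2d(feat),
                                    nn.Conv2d(feat, 1, 1))

    def forward(self, pet_low, ct):
        fused = torch.cat([self.pet_branch(pet_low), self.ct_branch(ct)], dim=1)
        return pet_low + self.fusion(fused)  # residual prediction of standard-dose PET


def image_gradients(x):
    """Finite-difference gradients along height and width."""
    dh = x[:, :, 1:, :] - x[:, :, :-1, :]
    dw = x[:, :, :, 1:] - x[:, :, :, :-1]
    return dh, dw


def structure_promoting_loss(pred_pet, std_pet, ct, alpha=1.0, beta=0.1):
    """L1 fidelity plus a term encouraging PET edges to align with CT edges
    (one plausible form of a multi-modality structure-promoting term)."""
    fidelity = F.l1_loss(pred_pet, std_pet)
    (ph, pw), (ch, cw) = image_gradients(pred_pet), image_gradients(ct)
    structure = F.l1_loss(ph.abs(), ch.abs()) + F.l1_loss(pw.abs(), cw.abs())
    return alpha * fidelity + beta * structure
```

In this sketch the CT branch supplies anatomical context twice: through feature fusion in the network and through the gradient-matching term in the loss, which is one simple way a CT image can promote edge structure in the reconstructed PET.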