Inter-fraction deformable image registration using unsupervised deep learning for CBCT-guided abdominal radiotherapy.
Huiqiao Xie, Yang Lei, Yabo Fu, Tonghe Wang, Justin Roper, Jeffrey D Bradley, Pretesh Patel, Tian Liu, Xiaofeng Yang
Published in: Physics in Medicine and Biology (2023)
CBCTs in image-guided radiotherapy provide crucial anatomical information for patient setup and plan evaluation. Longitudinal CBCT image registration could quantify inter-fractional anatomic changes, such as tumor shrinkage and daily organ-at-risk (OAR) variation, throughout the course of treatment. The purpose of this study is to propose an unsupervised deep learning-based CBCT-to-CBCT deformable image registration method that enables quantitative analysis of anatomic variation. The proposed registration workflow consists of training and inference stages that share the same feed-forward path through a spatial transformation-based network (STN). The STN comprises a global generative adversarial network (GlobalGAN) and a local GAN (LocalGAN), which predict the coarse- and fine-scale motions, respectively. The network was trained by minimizing an image similarity loss and a deformation vector field (DVF) regularization loss, without supervision from ground-truth DVFs. During the inference stage, patches of the local DVF are predicted by the trained LocalGAN and fused to form a whole-image local DVF, which is subsequently combined with the GlobalGAN-generated DVF to obtain the final DVF. The proposed method was evaluated using 100 fractional CBCTs from 20 abdominal cancer patients in the experiments and 105 fractional CBCTs from a holdout cohort of 21 different abdominal cancer patients. Qualitatively, the registration results show good alignment between the deformed and target CBCT images. Quantitatively, the average target registration error (TRE), calculated on fiducial markers and manually identified landmarks, was 1.91 ± 1.18 mm; the average mean absolute error (MAE) and normalized cross-correlation (NCC) between the deformed and target CBCTs were 33.42 ± 7.48 HU and 0.94 ± 0.04, respectively. In summary, an unsupervised deep learning-based CBCT-to-CBCT registration method is proposed, and its feasibility and performance in fractionated image-guided radiotherapy are investigated. This promising registration method could provide fast and accurate longitudinal CBCT alignment to facilitate the analysis and prediction of inter-fractional anatomic changes.
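The abstract's core idea, training a registration network by minimizing an image similarity loss plus a DVF regularization loss rather than matching ground-truth DVFs, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the function names (`warp`, `smoothness`, `unsupervised_loss`, `compose`), the choice of L1 as the similarity term, the finite-difference smoothness penalty, the voxel-displacement channel convention, and the additive coarse/fine composition are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def warp(img, dvf):
    """Warp an image (or a DVF, channel-wise) with a dense displacement field.

    img: (B, C, D, H, W) tensor; dvf: (B, 3, D, H, W) displacements in voxels,
    channel order assumed (dx, dy, dz).
    """
    B, _, D, H, W = img.shape
    # Identity sampling grid in the normalized [-1, 1] coordinates that
    # grid_sample expects, with the last dimension ordered (x, y, z).
    zs, ys, xs = (torch.linspace(-1, 1, n, device=img.device) for n in (D, H, W))
    gz, gy, gx = torch.meshgrid(zs, ys, xs, indexing="ij")
    identity = torch.stack((gx, gy, gz), dim=-1).unsqueeze(0)  # (1, D, H, W, 3)
    # Convert voxel displacements to normalized coordinate offsets.
    scale = torch.tensor(
        [2.0 / max(W - 1, 1), 2.0 / max(H - 1, 1), 2.0 / max(D - 1, 1)],
        device=img.device,
    )
    offset = dvf.permute(0, 2, 3, 4, 1) * scale  # (B, D, H, W, 3)
    return F.grid_sample(img, identity + offset, align_corners=True)


def smoothness(dvf):
    """DVF regularization: L2 penalty on finite-difference spatial gradients."""
    dz = dvf[:, :, 1:] - dvf[:, :, :-1]
    dy = dvf[:, :, :, 1:] - dvf[:, :, :, :-1]
    dx = dvf[:, :, :, :, 1:] - dvf[:, :, :, :, :-1]
    return dz.pow(2).mean() + dy.pow(2).mean() + dx.pow(2).mean()


def unsupervised_loss(moving, target, dvf, lam=0.1):
    """Image similarity + DVF regularization; no ground-truth DVF needed."""
    warped = warp(moving, dvf)
    return F.l1_loss(warped, target) + lam * smoothness(dvf)


def compose(global_dvf, local_dvf):
    """One plausible way to combine coarse and fine fields: resample the
    global DVF through the local one, then add the local displacements."""
    return warp(global_dvf, local_dvf) + local_dvf
```

Under these assumptions, `unsupervised_loss` would drive training of the GlobalGAN and LocalGAN generators, with `lam` trading off alignment fidelity against DVF smoothness, and `compose` stands in for the final combination of the GlobalGAN DVF with the fused whole-image LocalGAN DVF at inference.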
Keyphrases
- deep learning
- cone beam computed tomography
- image quality
- machine learning
- convolutional neural network
- artificial intelligence
- computed tomography
- radiation therapy