Improving breast tumor segmentation via shape-wise prior-guided information on cone-beam breast CT images.
Tongxu Lin, Junyu Lin, Guoheng Huang, Xiaochen Yuan, Guo Zhong, Fenfang Xie, Jiao Li
Published in: Physics in Medicine and Biology (2023)
Due to the blurry edges and uneven shapes of breast tumors, breast
tumor segmentation is a challenging task. Recently, deep convolutional network
(DCN)-based approaches have achieved satisfactory segmentation results. However, the
learned shape information of breast tumors might be lost owing to the successive
convolution and down-sampling operations, resulting in limited performance. To this
end, we propose a novel Shape-Guided Segmentation (SGS) framework that guides
the segmentation networks to be shape-sensitive to breast tumors by prior shape
information. Different from usual segmentation networks, we guide the networks to
model shape-shared representation with the assumption that shape information of
breast tumors can be shared among samples. Specifically, on the one hand, we propose
a Shape Guiding Block (SGB) to provide shape guidance through a superpixel pooling-unpooling operation and an attention mechanism. On the other hand, we introduce
the Shared Classification Layer (SCL) to address the problems introduced by the SGB,
namely feature inconsistency and additional computational cost. Additionally, the
proposed SGB and SCL can be effortlessly incorporated into mainstream segmentation
networks (e.g., UNet) to compose the SGS, facilitating compact shape-friendly
representation learning. Experiments conducted on a private dataset and a public
dataset demonstrate the effectiveness of the SGS compared to other advanced methods.
The source code is made available at https://github.com/TxLin7/Shape-Seg.
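To make the superpixel pooling-unpooling idea concrete, the sketch below averages feature values within each superpixel region (pooling) and broadcasts each region's mean back to its pixels (unpooling), yielding a shape-aware smoothed map. This is a minimal illustrative sketch with toy data, not the authors' implementation; the function name and the half-split superpixel labels are assumptions for demonstration.

```python
import numpy as np

def superpixel_pool_unpool(features, labels):
    """Illustrative superpixel pooling-unpooling (not the paper's code).

    Pooling: average feature values inside each superpixel region.
    Unpooling: broadcast each region's mean back to all its pixels.
    """
    out = np.empty_like(features, dtype=float)
    for sp in np.unique(labels):
        mask = labels == sp          # pixels belonging to this superpixel
        out[mask] = features[mask].mean()  # pool, then broadcast back
    return out

# Toy 4x4 feature map with two hypothetical superpixels (left/right halves).
feats = np.arange(16, dtype=float).reshape(4, 4)
labels = np.zeros((4, 4), dtype=int)
labels[:, 2:] = 1
smoothed = superpixel_pool_unpool(feats, labels)
```

In practice the superpixel labels would come from an over-segmentation algorithm (e.g. SLIC) applied to the input image, and the pooled-unpooled map would modulate the network's features through the attention mechanism described above.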