Comparison of Breast MRI Tumor Classification Using Human-Engineered Radiomics, Transfer Learning From Deep Convolutional Neural Networks, and Fusion Methods.
Heather M Whitney, Hui Li, Yu Ji, Peifang Liu, Maryellen L Giger. Published in: Proceedings of the IEEE. Institute of Electrical and Electronics Engineers (2019)
Digital image-based signatures of breast tumors may ultimately contribute to the design of patient-specific breast cancer diagnostics and treatments. Beyond traditional human-engineered computer vision methods, tumor classification methods using transfer learning from deep convolutional neural networks (CNNs) are actively under development. This article first discusses our progress in using CNN-based transfer learning to characterize breast tumors for various diagnostic, prognostic, or predictive image-based tasks across multiple imaging modalities, including mammography, digital breast tomosynthesis, ultrasound (US), and magnetic resonance imaging (MRI), and compares this approach with human-engineered feature-based radiomics and with fusion classifiers created by combining both types of features. Second, a new study is presented that reports a comprehensive comparison of the classification performances of human-engineered radiomic features, CNN transfer-learning features, and fusion classifiers for breast lesions imaged with MRI. These studies demonstrate the utility of transfer learning for computer-aided diagnosis and highlight the synergistic improvement in classification performance obtained with fusion classifiers.
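To illustrate the idea of a fusion classifier, the sketch below fuses the posterior scores of two hypothetical classifiers (one built on human-engineered radiomic features, one on CNN transfer-learning features) by simple averaging and compares areas under the ROC curve (AUC). This is a minimal illustration on synthetic scores, not the paper's actual pipeline: the data, the averaging scheme, and the score distributions are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not real data): posterior scores for
# 100 benign (label 0) and 100 malignant (label 1) lesions.
labels = np.concatenate([np.zeros(100), np.ones(100)])

# Hypothetical outputs of two separate classifiers: one trained on
# human-engineered radiomic features, one on CNN transfer-learning
# features. Malignant cases score higher on average.
radiomics_scores = np.clip(rng.normal(0.4 + 0.2 * labels, 0.15), 0, 1)
cnn_scores = np.clip(rng.normal(0.4 + 0.2 * labels, 0.15), 0, 1)

# Soft fusion: average the two classifiers' posterior scores
# (one common fusion scheme; the paper's exact method may differ).
fusion_scores = 0.5 * (radiomics_scores + cnn_scores)

def auc(scores, labels):
    """ROC AUC computed via the Mann-Whitney U statistic."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Fraction of (malignant, benign) pairs ranked correctly; ties count 0.5.
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

for name, s in [("radiomics", radiomics_scores),
                ("CNN", cnn_scores),
                ("fusion", fusion_scores)]:
    print(f"{name}: AUC = {auc(s, labels):.3f}")
```

Because the two score sets carry partly independent noise, averaging them tends to raise the AUC above either classifier alone, which is the "synergistic improvement" the abstract refers to.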
Keyphrases
- deep learning
- convolutional neural network
- magnetic resonance imaging
- contrast enhanced
- machine learning
- magnetic resonance
- diffusion weighted imaging
- high resolution
- lymph node metastasis
- image quality