Circumventing the curse of dimensionality in magnetic resonance fingerprinting through a deep learning approach.

Marco Barbieri, Philip K Lee, Leonardo Brizi, Enrico Giampieri, Francesco Solera, Gastone C Castellani, Brian A Hargreaves, Claudia Testa, Raffaele Lodi, Daniel Remondini
Published in: NMR in Biomedicine (2022)
Magnetic resonance fingerprinting (MRF) is a rapidly developing approach for fast quantitative MRI. A typical drawback of dictionary-based MRF is that the dictionary size grows multiplicatively with the number of reconstructed parameters (the "curse of dimensionality"), driving a corresponding explosion in resource requirements. Neural networks (NNs) have been proposed as a feasible alternative, but this approach is still in its infancy. In this work, we design a deep learning approach to MRF using a fully connected network (FCN). In the first part we investigate, by means of simulations, how the NN performance scales with the number of parameters to be retrieved, in comparison with the standard dictionary approach. Four MRF sequences were considered: IR-FISP, bSSFP, IR-FISP-B1, and IR-bSSFP-B1, the latter two designed to be more specific for B1+ parameter encoding. Estimation accuracy, memory usage, and the computational time required to perform the estimation task were compared to assess the scalability of the dictionary-based and NN approaches. In the second part we study optimal training procedures, including different data augmentation and preprocessing strategies during training, to achieve better accuracy and robustness to noise and undersampling artifacts. This study is conducted with the IR-FISP MRF sequence, exploiting both simulations and in vivo acquisitions. Results demonstrate that the NN approach outperforms the dictionary-based approach in terms of scalability. Results also allow us to heuristically determine the optimal training strategy that enables an FCN to predict T1, T2, and M0 maps in good agreement with those obtained with the original dictionary approach. k-SVD denoising is proposed and found to be a critical preprocessing step for handling undersampled data.
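The scaling argument above can be made concrete with a small sketch (not from the paper; the parameter grid sizes below are hypothetical): an exhaustive dictionary must store one simulated fingerprint per point of the Cartesian product of all parameter grids, so its size multiplies with each added parameter, whereas an FCN only needs one additional output per parameter.

```python
from math import prod

# Hypothetical discretization grids for each quantified parameter
# (illustrative values only, not those used in the study).
grid = {"T1": 100, "T2": 100, "B1": 20, "M0": 50}

def dictionary_size(params):
    """Entries in an exhaustive MRF dictionary: product of grid sizes."""
    return prod(grid[p] for p in params)

def fcn_output_size(params):
    """An FCN regressor needs only one output neuron per parameter."""
    return len(params)

for params in (["T1", "T2"], ["T1", "T2", "B1"], ["T1", "T2", "B1", "M0"]):
    print(f"{len(params)} params: dictionary={dictionary_size(params):>10,} "
          f"vs FCN outputs={fcn_output_size(params)}")
# 2 params: dictionary=10,000; 3 params: 200,000; 4 params: 10,000,000
```

The multiplicative growth of the dictionary (10,000 to 10,000,000 entries when going from two to four parameters here) is what the abstract calls the curse of dimensionality; the NN's memory footprint is instead dominated by its fixed weight matrices.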