
One-shot Federated Learning on Medical Data using Knowledge Distillation with Image Synthesis and Client Model Adaptation.

Myeongkyun Kang, Philip Chikontwe, Soopil Kim, Kyong Hwan Jin, Ehsan Adeli, Kilian M. Pohl, Sang Hyun Park
Published in: Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2023
One-shot federated learning (FL) has emerged as a promising solution in scenarios where multiple communication rounds are not practical. Notably, as feature distributions in medical data are less discriminative than those of natural images, robust global model training with FL is non-trivial and can lead to overfitting. To address this issue, we propose a novel one-shot FL framework leveraging Image Synthesis and Client model Adaptation (FedISCA) with knowledge distillation (KD). To prevent overfitting, we generate diverse synthetic images ranging from random noise to realistic images. This approach (i) alleviates data privacy concerns and (ii) facilitates robust global model training using KD with decentralized client models. To mitigate domain disparity in the early stages of synthesis, we design noise-adapted client models in which batch normalization statistics are updated on random noise (synthetic images) to enhance KD. Lastly, the global model is trained with both the original and noise-adapted client models via KD on synthetic images. This process is repeated until the global model converges. Extensive evaluation of this design on five small- and three large-scale medical image classification datasets reveals superior accuracy over prior methods. Code is available at https://github.com/myeongkyunkang/FedISCA.
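The official implementation is at the GitHub link above. As a rough illustration only (not the paper's code), the sketch below shows two ingredients the abstract describes, under simplified assumptions: a temperature-softened KD loss between a teacher (client) model and a student (global) model, and a batch-normalization statistic update on a batch of synthetic/noise activations, standing in for the noise-adapted client models. Function names and the scalar-activation simplification are hypothetical.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    exps = [math.exp(l / T) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, T)  # teacher (client model) targets
    q = softmax(student_logits, T)  # student (global model) predictions
    return (T * T) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def adapt_bn_stats(running_mean, running_var, batch, momentum=0.1):
    # Update BatchNorm running statistics from a batch of synthetic
    # (noise) activations -- a simplified, per-feature stand-in for the
    # client-model adaptation step described in the abstract.
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    new_mean = (1 - momentum) * running_mean + momentum * mean
    new_var = (1 - momentum) * running_var + momentum * var
    return new_mean, new_var
```

For example, distilling a student whose logits already match the teacher's yields zero KD loss, while adapting BN statistics toward noise activations shifts the running mean and variance by the momentum-weighted batch statistics.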
Keyphrases
  • one-shot federated learning
  • knowledge distillation
  • image synthesis
  • client model adaptation
  • medical image classification