
Orchestrating explainable artificial intelligence for multimodal and longitudinal data in medical imaging.

Aurélie Pahud de Mortanges, Haozhe Luo, Shelley Zixin Shu, Amith Kamath, Yannick Suter, Mohamed Shelan, Alexander Pöllinger, Mauricio Reyes
Published in: npj Digital Medicine (2024)
Explainable artificial intelligence (XAI) has experienced a vast increase in recognition over the last few years. While the technical developments are manifold, less focus has been placed on the clinical applicability and usability of such systems. Moreover, little attention has been given to XAI systems that can handle multimodal and longitudinal data, which we postulate are important features in many clinical workflows. In this study, we review, from a clinical perspective, the current state of XAI for multimodal and longitudinal datasets and highlight the challenges thereof. Additionally, we propose the XAI orchestrator, an instance that aims to help clinicians with the synopsis of multimodal and longitudinal data, the resulting AI predictions, and the corresponding explainability output. We propose several desirable properties of the XAI orchestrator, such as being adaptive, hierarchical, interactive, and uncertainty-aware.
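The abstract does not specify an implementation, but the desirable properties it names (multimodal and longitudinal inputs, hierarchical summaries, interactivity, uncertainty awareness) can be pictured with a minimal sketch. Everything below, including the ModalityOutput and XAIOrchestrator names, their fields, and the thresholding logic, is a hypothetical illustration rather than the authors' design.

```python
# Illustrative sketch only; names and fields are assumptions, not from the paper.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ModalityOutput:
    """Prediction, explanation, and uncertainty for one modality at one timepoint."""
    modality: str                   # e.g. "MRI", "CT", "pathology report"
    timepoint: str                  # e.g. "baseline", "follow-up-1"
    prediction: float               # model output, e.g. a progression probability
    explanation: Dict[str, float]   # feature or region name -> attribution score
    uncertainty: float              # e.g. predictive entropy or ensemble variance


@dataclass
class XAIOrchestrator:
    """Collects per-modality outputs and builds a hierarchical, uncertainty-aware summary."""
    outputs: List[ModalityOutput] = field(default_factory=list)

    def add(self, output: ModalityOutput) -> None:
        self.outputs.append(output)

    def summary(self, uncertainty_threshold: float = 0.5) -> Dict[str, Dict[str, object]]:
        """Top level: one entry per timepoint; per-modality detail stays available on demand."""
        per_timepoint: Dict[str, Dict[str, object]] = {}
        for out in self.outputs:
            tp = per_timepoint.setdefault(
                out.timepoint, {"predictions": {}, "flagged_uncertain": []}
            )
            tp["predictions"][out.modality] = out.prediction
            if out.uncertainty > uncertainty_threshold:
                # Uncertainty-aware: flag modalities whose explanation needs caution.
                tp["flagged_uncertain"].append(out.modality)
        return per_timepoint


# Example with hypothetical values:
# orch = XAIOrchestrator()
# orch.add(ModalityOutput("MRI", "baseline", 0.82, {"tumor core": 0.6}, uncertainty=0.2))
# orch.add(ModalityOutput("CT", "follow-up-1", 0.74, {"lesion A": 0.5}, uncertainty=0.7))
# print(orch.summary())  # "CT" is flagged as uncertain at follow-up-1
```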
Keyphrases
  • artificial intelligence
  • big data
  • machine learning
  • deep learning
  • electronic health record
  • healthcare