Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma.

Tirtha Chanda, Katja Hauser, Sarah Hobelsberger, Tabea-Clara Bucher, Carina Nogueira Garcia, Christoph Wies, Harald Kittler, Philipp Tschandl, Cristian Navarrete-Dechent, Sebastian Podlipnik, Emmanouil Chousakos, Iva Crnaric, Jovana Majstorovic, Linda Alhajwan, Tanya Foreman, Sandra Peternel, Sergei Sarap, İrem Özdemir, Raymond L. Barnhill, Mar Llamas-Velasco, Gabriela Poch, Sören Korsing, Wiebke Sondermann, Frank Friedrich Gellrich, Markus V. Heppt, Michael Erdmann, Sebastian Haferkamp, Konstantin Drexler, Matthias Goebeler, Bastian Schilling, Jochen Sven Utikal, Kamran Ghoreschi, Stefan Fröhling, Eva Krieghoff-Henning, Titus Josef Brinker
Published in: Nature Communications (2024)
Artificial intelligence (AI) systems have been shown to help dermatologists diagnose melanoma more accurately; however, they lack transparency, hindering user acceptance. Explainable AI (XAI) methods can help increase transparency, yet they often lack precise, domain-specific explanations. Moreover, the impact of XAI methods on dermatologists' decisions has not yet been evaluated. Building upon previous research, we introduce an XAI system that provides precise and domain-specific explanations alongside its differential diagnoses of melanomas and nevi. Through a three-phase study, we assess its impact on dermatologists' diagnostic accuracy, diagnostic confidence, and trust in the XAI support. Our results show strong alignment between XAI and dermatologist explanations. We also show that dermatologists' confidence in their diagnoses and their trust in the support system significantly increase with XAI compared to conventional AI. This study highlights dermatologists' willingness to adopt such XAI systems, promoting future use in the clinic.
Keyphrases
  • artificial intelligence
  • skin cancer
  • machine learning
  • big data
  • deep learning
  • health information
  • primary care
  • clinical trial
  • healthcare