
Endoscopic Image Classification Based on Explainable Deep Learning.

Doniyorjon Mukhtorov, Madinakhon Rakhmonova, Muksimova Shakhnoza, Young-Im Cho
Published in: Sensors (Basel, Switzerland) (2023)
Deep learning has had a remarkably positive impact on medical diagnostics in recent years and, across many proposed applications, has reached accuracy levels sufficient for practical deployment. However, these models are black boxes that are hard to interpret, and their decisions are often produced without any accompanying explanation. To narrow this gap, explainable artificial intelligence (XAI) offers an opportunity to obtain informed decision support from deep learning models by opening the black box. We developed an explainable deep learning method based on ResNet152 combined with Grad-CAM for endoscopic image classification. We used the open-source KVASIR dataset, which consists of a total of 8000 wireless capsule images. Combining heat maps of the classification results with an efficient augmentation method, the approach achieved 98.28% training and 93.46% validation accuracy for medical image classification.
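The paper itself provides no code here; the following is a minimal sketch, in PyTorch, of how Grad-CAM heat maps can be extracted from a ResNet152 classifier of the kind the abstract describes. The choice of target layer (the last block of layer4), the 8-class output head for KVASIR, and all identifiers are illustrative assumptions, not the authors' implementation.

```python
# Minimal Grad-CAM sketch for a ResNet152 classifier (assumed setup, not the
# authors' code): hooks capture the target layer's activations and gradients,
# which are combined into a class-discriminative heat map.
import torch
import torch.nn.functional as F
from torchvision.models import resnet152

class GradCAM:
    def __init__(self, model, target_layer):
        self.model = model.eval()
        self.activations = None
        self.gradients = None
        # Record feature maps on the forward pass and their gradients
        # on the backward pass.
        target_layer.register_forward_hook(self._save_activation)
        target_layer.register_full_backward_hook(self._save_gradient)

    def _save_activation(self, module, inp, out):
        self.activations = out.detach()

    def _save_gradient(self, module, grad_in, grad_out):
        self.gradients = grad_out[0].detach()

    def __call__(self, image, class_idx=None):
        logits = self.model(image)                      # (1, num_classes)
        if class_idx is None:
            class_idx = logits.argmax(dim=1).item()
        self.model.zero_grad()
        logits[0, class_idx].backward()
        # Global-average-pool the gradients to get per-channel weights,
        # then take a ReLU of the weighted sum of activation maps.
        weights = self.gradients.mean(dim=(2, 3), keepdim=True)  # (1, C, 1, 1)
        cam = F.relu((weights * self.activations).sum(dim=1))    # (1, H, W)
        cam = F.interpolate(cam.unsqueeze(1), size=image.shape[2:],
                            mode="bilinear", align_corners=False)
        cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
        return cam.squeeze().cpu(), class_idx

# Usage: ResNet152 re-headed for 8 classes (an assumed KVASIR class count),
# with the final block of layer4 as the Grad-CAM target layer.
model = resnet152(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, 8)
cam_extractor = GradCAM(model, model.layer4[-1])
dummy = torch.randn(1, 3, 224, 224)   # stand-in for a preprocessed endoscopy image
heatmap, predicted = cam_extractor(dummy)
print(heatmap.shape, predicted)
```

The resulting heat map can be overlaid on the input frame to indicate which image regions most influenced the predicted class, which is the interpretability signal the abstract refers to.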
Keyphrases
  • deep learning
  • artificial intelligence
  • convolutional neural network
  • machine learning
  • big data
  • healthcare