
Sources of bias in artificial intelligence that perpetuate healthcare disparities-A global review.

Leo Anthony Celi, Jacqueline Cellini, Marie-Laure Charpignon, Edward Christopher Dee, Franck Dernoncourt, Rene Eber, William Greig Mitchell, Lama Moukheiber, Julian Schirmer, Julia Situ, Joseph Paguio, Joel C. Park, Judy Gichoya Wawira, Jasper Seth Yao
Published in: PLOS Digital Health (2022)
U.S. and Chinese datasets and authors were disproportionately overrepresented in clinical AI, and almost all of the top 10 databases and author nationalities were from high-income countries (HICs). AI techniques were most commonly employed in image-rich specialties, and authors were predominantly male and from non-clinical backgrounds. In the short term, developing technological infrastructure in data-poor regions and exercising diligence in external validation and model re-calibration prior to clinical implementation are crucial to ensuring clinical AI is meaningful for broader populations and to avoiding the perpetuation of global health inequity.
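To make the recommendation of "model re-calibration prior to clinical implementation" concrete, the following is a minimal sketch of logistic (Platt-style) re-calibration of an externally developed risk model on a local validation cohort. It is not from the paper: the pretrained model and the `X_local` / `y_local` data are hypothetical placeholders, and logistic re-calibration is only one of several possible approaches.

```python
# Hypothetical sketch: re-calibrate an externally trained risk model on a
# local validation cohort before clinical use. `pretrained_model` is assumed
# to be any fitted scikit-learn-style classifier with predict_proba().
import numpy as np
from sklearn.linear_model import LogisticRegression


def fit_recalibrator(pretrained_model, X_local, y_local):
    """Fit a calibration intercept and slope (logistic re-calibration)
    on the logits of the pretrained model's predicted risks."""
    p = pretrained_model.predict_proba(X_local)[:, 1]
    p = np.clip(p, 1e-6, 1 - 1e-6)          # avoid log(0) at the extremes
    logits = np.log(p / (1 - p)).reshape(-1, 1)
    calibrator = LogisticRegression()
    calibrator.fit(logits, y_local)          # learns local intercept/slope
    return calibrator


def predict_recalibrated(pretrained_model, calibrator, X):
    """Return locally re-calibrated risk estimates for new patients."""
    p = pretrained_model.predict_proba(X)[:, 1]
    p = np.clip(p, 1e-6, 1 - 1e-6)
    logits = np.log(p / (1 - p)).reshape(-1, 1)
    return calibrator.predict_proba(logits)[:, 1]
```

In practice this step would follow an external validation on the local cohort (discrimination and calibration checks), so that miscalibration introduced by population shift is corrected before the model informs care.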
Keyphrases
  • artificial intelligence
  • healthcare
  • big data
  • deep learning
  • machine learning
  • global health
  • primary care
  • physical activity
  • low cost