Article

The three ghosts of medical AI: Can the black-box present deliver?

Journal

ARTIFICIAL INTELLIGENCE IN MEDICINE
Volume 124, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.artmed.2021.102158

Keywords

XAI; Black-box; Ethics; Challenges; Transparency; Autonomy


Our title alludes to the three Christmas ghosts encountered by Ebenezer Scrooge in A Christmas Carol, who guide Ebenezer through the past, present, and future of Christmas holiday events. Similarly, our article takes readers through a journey of the past, present, and future of medical AI. In doing so, we focus on the crux of modern machine learning: the reliance on powerful but intrinsically opaque models. When applied to the healthcare domain, these models fail to meet the needs for transparency that their clinician and patient end-users require. We review the implications of this failure, and argue that opaque models (1) lack quality assurance, (2) fail to elicit trust, and (3) restrict physician-patient dialogue. We then discuss how upholding transparency in all aspects of model design and model validation can help ensure the reliability and success of medical AI.

