Article

Why Is Artificial Intelligence Blamed More? Analysis of Faulting Artificial Intelligence for Self-Driving Car Accidents in Experimental Settings

Journal

Publisher

TAYLOR & FRANCIS INC
DOI: 10.1080/10447318.2020.1785693

Keywords

-

This study conducted an experiment to test how the level of blame differs between an artificial intelligence (AI) driver and a human driver, drawing on attribution theory and the computers are social actors (CASA) framework. It used a 2 (human vs. AI driver) x 2 (victim survived vs. victim died) x 2 (female vs. male driver) design. After reading a given scenario, participants (N = 284) were asked to assign a level of responsibility to the driver. Participants blamed the driver more when the driver was an AI than when the driver was a human. A higher level of blame was also assigned when the outcome was more severe. However, driver gender had no significant effect on blame. These results indicate that the tendency to blame AI stems from a perception of dissimilarity and that the seriousness of the outcome influences the level of blame. Implications of the findings for applications and theory are discussed.
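The abstract describes a 2 x 2 x 2 factorial design with blame level as the dependent measure. As an illustrative sketch only (not the authors' analysis code), the Python snippet below shows how responses from such a design are commonly analyzed with a three-way ANOVA; the column names, the 7-point blame scale, the between-subjects assumption, and the simulated data are assumptions for demonstration.

# Minimal sketch (not the authors' code): three-way ANOVA for a
# 2 (driver) x 2 (outcome) x 2 (gender) design with blame as the DV.
# Column names and the 7-point scale are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 284  # sample size reported in the abstract

# Simulated placeholder data standing in for participants' responses.
df = pd.DataFrame({
    "driver":  rng.choice(["human", "AI"], n),       # human vs. AI driver
    "outcome": rng.choice(["survived", "died"], n),  # victim survived vs. died
    "gender":  rng.choice(["female", "male"], n),    # female vs. male driver
    "blame":   rng.uniform(1, 7, n),                 # e.g., 7-point blame rating
})

# Fit a model with all main effects and interactions, then print the ANOVA table.
model = smf.ols("blame ~ C(driver) * C(outcome) * C(gender)", data=df).fit()
print(anova_lm(model, typ=2))

On real data, such a table would show the main effects of driver type and outcome severity (both reported as significant in the study) and the non-significant effect of driver gender, along with their interactions.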
