4.1 Article

Chatbots and Gender Stereotyping

Journal

INTERACTING WITH COMPUTERS
Volume 31, Issue 2, Pages 116-121

Publisher

OXFORD UNIV PRESS
DOI: 10.1093/iwc/iwz007

Keywords

conversational user interfaces; chatbots; gender stereotypes


Chatbots are very much an emerging technology, and there is still much to learn about how conversational user interfaces will affect the way in which humans communicate, not only with computers but also with one another. Further studies on anthropomorphic agents and the projection of human characteristics onto a system are needed to develop this area. Gender stereotypes have a profound effect on human behaviour, and assigning a gender to a conversational agent brings with it the projection of user biases and preconceptions. These feelings and perceptions about an agent shape users' mental models of a system, and users may judge the success of a system on the basis of their biases and emotional connection with the agent rather than on the system's performance. Many studies have shown how gender affects human perceptions of a conversational agent; however, there is limited research on the effect of gender when applied to a chatbot system. This paper presents early results from a research study indicating that chatbot gender does have an effect on users' overall satisfaction and on gender-stereotypical perception. Subsequent studies could examine the ethical implications of these results and extend the work by increasing the sample size to confirm statistical significance, as well as by recruiting a more diverse sample of participants from a range of backgrounds and experiences.

RESEARCH HIGHLIGHTS

- Many studies have shown how gender affects human perceptions of a conversational agent; however, there is limited research on the effect of gender when applied to a chatbot system.
- This study presents early results indicating that chatbot gender does have an effect on users' overall satisfaction and gender-stereotypical perception.
- Users are more likely to apply gender stereotypes when a chatbot operates within a gender-stereotypical subject domain, such as mechanics, and when the chatbot's gender does not conform to those stereotypes.
- The study raises ethical issues: should we exploit this result and perpetuate the bias and stereotyping? Should technical-advice chatbots really be given a male persona? The dilemma is that a male version may elicit more trust while perpetuating the stereotype.

