Article

Can AI be racist? Color-evasiveness in the application of machine learning to science assessments

Journal

SCIENCE EDUCATION
Volume 105, Issue 5, Pages 825-836

Publisher

WILEY
DOI: 10.1002/sce.21671

Keywords

artificial intelligence; assessment; automated scoring; English learners; machine learning

Funding

  1. National Science Foundation [DUE 1561150]

Abstract

Assessment developers are increasingly using the developing technology of machine learning to transform how they assess students' science learning. I argue that these algorithmic models further embed the structures of inequality that are pervasive in the development of science assessments, in how they legitimize certain language practices that protect the hierarchical standing of status quo interests. My argument is situated within the broader emerging ethical challenges around this new technology. I apply a raciolinguistic equity analysis framework in critiquing the new black box that reinforces structural forms of discrimination against the linguistic repertoires of racially marginalized student populations. The article ends with a set of tactical shifts that can be deployed to form a more equitable and socially just field of machine learning-enhanced science assessments.


