“Computer says no”: Artificial Intelligence, gender bias, and epistemic injustice

dc.check.date: 2025-04-17
dc.check.info: Access to this article is restricted until 18 months after publication by request of the publisher
dc.contributor.author: Walmsley, Joel
dc.date.accessioned: 2023-12-05T14:45:23Z
dc.date.available: 2023-12-05T14:45:23Z
dc.date.issued: 2023-10-17
dc.description.abstract: Ever since its beginnings, the field of Artificial Intelligence (AI) has been plagued with the possibility of perpetuating a range of depressingly familiar kinds of injustice, roughly because the biases and prejudices of the programmers can be, literally, codified. But several recent controversies about biased machine translation and automated CV-evaluation highlight a novel set of concerns that are simultaneously both ethical and epistemological, and which stem from AI's most recent developments: we don't fully understand the machines we've built; they're faster, more powerful, and more complex than we are; but we're growing to trust them preferentially nonetheless. This chapter examines some of the ways in which Miranda Fricker's concept(s) of "epistemic injustice" can highlight such problems, and concludes with suggestions about re-conceiving human-AI interaction—along the model of collaboration rather than competition—that might avoid them.
dc.description.status: Peer reviewed
dc.description.version: Accepted Version
dc.format.mimetype: application/pdf
dc.identifier.citation: Walmsley, J. (2023) '"Computer says no": Artificial Intelligence, gender bias, and epistemic injustice', in Edwards, M. L. and Palermos, S. O. (eds.) Feminist Philosophy and Emerging Technologies. Abingdon: Routledge, pp. 249-263. doi: 10.4324/9781003275992-16
dc.identifier.doi: 10.4324/9781003275992-16
dc.identifier.endpage: 263
dc.identifier.isbn: 9781003275992
dc.identifier.startpage: 249
dc.identifier.uri: https://hdl.handle.net/10468/15300
dc.language.iso: en
dc.publisher: Routledge
dc.relation.ispartof: Feminist Philosophy and Emerging Technologies
dc.rights: © 2023, Joel Walmsley. All rights reserved. This is an Accepted Manuscript of a book chapter published by Routledge in Feminist Philosophy and Emerging Technologies on 17 October 2023, available online: https://doi.org/10.4324/9781003275992-16
dc.subject: Artificial Intelligence
dc.subject: AI
dc.subject: Injustice
dc.subject: Bias
dc.subject: Prejudice
dc.subject: Biased machine translation
dc.subject: Automated CV-evaluation
dc.subject: Epistemic injustice
dc.title: "Computer says no": Artificial Intelligence, gender bias, and epistemic injustice
dc.type: Book chapter
Files

Original bundle (2 files)

Name: Walmsley - Computer Says No - Final with abstract.pdf
Size: 268.93 KB
Format: Adobe Portable Document Format
Description: Accepted Version

Name: Walmsley - Computer Says No - Final with abstract.docx
Size: 46.58 KB
Format: Microsoft Word XML
Description: Author's original accepted version
License bundle (1 file)

Name: license.txt
Size: 2.71 KB
Description: Item-specific license agreed upon to submission