“Computer says no”: Artificial Intelligence, gender bias, and epistemic injustice
dc.check.date | 2025-04-17 | en |
dc.check.info | Access to this article is restricted until 18 months after publication by request of the publisher | en |
dc.contributor.author | Walmsley, Joel | en |
dc.date.accessioned | 2023-12-05T14:45:23Z | |
dc.date.available | 2023-12-05T14:45:23Z | |
dc.date.issued | 2023-10-17 | en |
dc.description.abstract | Ever since its beginnings, the field of Artificial Intelligence (AI) has been plagued with the possibility of perpetuating a range of depressingly familiar kinds of injustice, roughly because the biases and prejudices of the programmers can be, literally, codified. But several recent controversies about biased machine translation and automated CV-evaluation highlight a novel set of concerns that are simultaneously ethical and epistemological, and which stem from AI's most recent developments: we don't fully understand the machines we've built; they're faster, more powerful, and more complex than we are; but we're growing to trust them preferentially nonetheless. This chapter examines some of the ways in which Miranda Fricker's concept(s) of “epistemic injustice” can highlight such problems, and concludes with suggestions about re-conceiving human-AI interaction—along the model of collaboration rather than competition—that might avoid them. | en |
dc.description.status | Peer reviewed | en |
dc.description.version | Accepted Version | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.citation | Walmsley, J. (2023) '“Computer says no”: Artificial Intelligence, gender bias, and epistemic injustice', in Edwards, M. L. and Palermos, S. O. (eds.) Feminist Philosophy and Emerging Technologies. Abingdon: Routledge, pp. 249-263. doi: 10.4324/9781003275992-16 | en |
dc.identifier.doi | 10.4324/9781003275992-16 | en |
dc.identifier.endpage | 263 | en |
dc.identifier.isbn | 9781003275992 | en |
dc.identifier.startpage | 249 | en |
dc.identifier.uri | https://hdl.handle.net/10468/15300 | |
dc.language.iso | en | en |
dc.publisher | Routledge | en |
dc.relation.ispartof | Feminist Philosophy and Emerging Technologies | en |
dc.rights | © 2023, Joel Walmsley. All rights reserved. This is an Accepted Manuscript of a book chapter published by Routledge in Feminist Philosophy and Emerging Technologies on 17 October 2023, available online: https://doi.org/10.4324/9781003275992-16 | en |
dc.subject | Artificial Intelligence | en |
dc.subject | AI | en |
dc.subject | Injustice | en |
dc.subject | Bias | en |
dc.subject | Prejudice | en |
dc.subject | Biased machine translation | en |
dc.subject | Automated CV-evaluation | en |
dc.subject | Epistemic injustice | en |
dc.title | “Computer says no”: Artificial Intelligence, gender bias, and epistemic injustice | en |
dc.type | Book chapter | en |
Files
Original bundle
- Name: Walmsley - Computer Says No - Final with abstract.pdf
- Size: 268.93 KB
- Format: Adobe Portable Document Format
- Description: Accepted Version
- Name: Walmsley - Computer Says No - Final with abstract.docx
- Size: 46.58 KB
- Format: Microsoft Word XML
- Description: Author's original accepted version
License bundle
- Name: license.txt
- Size: 2.71 KB
- Description: Item-specific license agreed upon to submission