UW News

Kyra Wilson


November 10, 2025

People mirror AI systems’ hiring biases, study finds


In a new UW study, 528 participants worked with simulated AI systems to select job candidates. The researchers simulated different levels of racial bias in how the systems treated resumes from white, Black, Hispanic and Asian men. Without AI suggestions, participants' choices showed little bias. But when given biased recommendations, participants mirrored the AI's biases.


October 31, 2024

AI tools show biases in ranking job applicants’ names according to perceived race and gender


University of Washington researchers found significant racial, gender and intersectional bias in how three state-of-the-art large language models ranked resumes. The models favored white-associated names 85% of the time, favored female-associated names only 11% of the time, and never favored Black male-associated names over white male-associated names.