Originally Posted by
Forogil
Saying that it "developed" a bias makes it sound as if it's thinking too much on its own; I would rather say that it in some way copied the bias against female candidates already present in the hiring process, and likely other biases as well. What is unknown is whether the human recruiters it copied had real reasons to prefer men, and specifically the candidates they chose, or whether they rejected candidates based on gender alone.
And it's not only a problem with gender - some AI systems have copied biases based on zip codes. There might also be other biases - above it avoided a women's college; what if the human recruiters favored their own college and avoided their archenemy in some college sport? The AI would likely copy that as well. For the individual it sucks to be turned down for no good reason, even if you aren't in a "protected class" - and it would be bad for the company too: they presumably want the best candidates, not to stroke the recruiter's ego.
In one sense an advantage of an AI is that it's easier to expose such biases: you can send in many applications that vary only in the school/women's chess club/... and clearly see the difference; whereas human recruiters show more natural variation and might notice the two near-identical applications sitting next to each other.
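That paired-probe idea can be sketched in a few lines. Everything here is hypothetical: `score_resume` stands in for the opaque model under test (with a deliberately planted keyword penalty so the audit has something to find), and `paired_audit` scores two applications that are identical except for one phrase.

```python
def score_resume(text: str) -> float:
    # Hypothetical black-box scorer: a base score plus a hidden bias term.
    score = 50.0 + 2.0 * text.count("python")
    if "women's" in text:
        score -= 10.0  # the planted bias the audit should expose
    return score

def paired_audit(template: str, variant_a: str, variant_b: str) -> float:
    """Score two otherwise-identical applications; return the score gap."""
    return score_resume(template.format(variant_a)) - score_resume(template.format(variant_b))

template = "5 years python experience; member of the {} chess club"
gap = paired_audit(template, "university", "women's")
print(f"score gap: {gap:+.1f}")  # prints "score gap: +10.0" for this toy model
```

Against a human panel the same probe is noisier and slower - you'd need many reviewers and repetitions to average out their variation, and two near-identical CVs risk being spotted - which is exactly the asymmetry described above.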