How Good Candidates Get Screened Out
AI reflects the implicit biases of the people who design it. Models trained on biased data may perpetuate historical bias against marginalized groups, such as people whose gender is non-binary, people of color, people with disabilities, and other minorities. Further, training data typically underrepresents these groups. Because these groups include people with disabilities, mitigating [...]