In 2020, the Leadership Conference on Civil & Human Rights and other advocacy groups released Civil Rights Principles for Hiring Assessment Technologies. The Center for Democracy and Technology summarized these principles, with emphasis on the elevated risks based on disability, in the report Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination? The report recommends the following framework for employers seeking to prevent or reduce the inequitable impact of algorithm-driven hiring tools.
The following resource has been republished with the permission of the author, the Center for Democracy & Technology.
Employers and vendors must engage in a meaningful and robust analysis of how algorithm-driven tests or screening tools can discriminate, and how that discrimination can be eliminated.
- Look for existing discriminatory hiring patterns to correct, including patterns that may be unknown or unintended.
- Include people with lived experiences in the team that designs and tests the algorithm.
- Understand that removing demographic data from the algorithm-building process is unlikely, on its own, to be sufficient to root out discrimination.
- When building the hiring tool’s algorithm, recognize that the data used to train an algorithm cannot truly represent all disabled applicants, and the tool may well have been designed without disabled people in mind. How will disabled candidates experience the tool? What are the risks of exclusion or bias?
- Make sure the team of developers includes people with disabilities, keeping in mind the numerous forms of disability.
- Plan ahead and create alternative testing and screening tools to accommodate applicants with disabilities in a way that does not restrict their ability to prove their skills.
- Ensure that tests taken with reasonable accommodations are given equal weight. One way to ensure this may be to divert a certain percentage of all job candidates into the alternative, non-algorithmic hiring track, to ensure that disabled candidates in that track are not improperly stigmatized.
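The diversion idea in the last bullet can be sketched in code. The snippet below is a minimal illustration, not a prescribed implementation: the candidate list, the 10% diversion rate, and the track names are all hypothetical, and the actual percentage would be a policy decision made by the employer.

```python
import random

def assign_tracks(candidate_ids, diversion_rate=0.10, seed=None):
    """Randomly divert a fixed share of ALL candidates into the
    alternative, non-algorithmic track, so that the alternative track
    is not made up only of applicants who requested accommodations.

    `diversion_rate` is a hypothetical policy parameter chosen for
    illustration; an employer would set this deliberately.
    """
    rng = random.Random(seed)  # seeded for a reproducible example
    assignments = {}
    for cid in candidate_ids:
        track = "alternative" if rng.random() < diversion_rate else "algorithmic"
        assignments[cid] = track
    return assignments

# Example: 1,000 hypothetical candidates, roughly 10% diverted at random.
tracks = assign_tracks(range(1000), diversion_rate=0.10, seed=42)
diverted = sum(1 for t in tracks.values() if t == "alternative")
```

Because diversion is random across the whole applicant pool, being routed to the alternative track carries no signal about disability status, which is the point of the recommendation above.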
Job-Relatedness: Algorithm-driven hiring tools look for data points that correlate with perceived successful job performance, but employers and vendors must ask whether the tool actually measures a person’s ability to perform the essential functions of the job. Employers must be able to explain what a tool measures and why. They should select traits that can be assessed objectively; mere correlation between a trait and job success does not justify a disparate impact on disadvantaged groups.
- Identify the skills required to perform each function of the job position and make sure the tool assesses only these skills, not traits that may be proxies for disability.
- Ask experts about any ways that applicants’ disabilities may prevent the tool from accurately evaluating their ability to perform the job’s functions. If the tool cannot be corrected, provide an alternative method to test job-related skills.
- Design the tool so that it does not screen out applicants based on more subjective traits, like “optimism” or “intensity,” that can look different in people with disabilities.
- Use metrics that are specifically related to tasks that applicants would need to be able to perform. For example, assess the applicant’s approach to driving sales, rather than how hopeful they feel about sales prospects.
Notice and Explanation: Explain to candidates how hiring tests work and how their performance will inform the hiring decision. Applicants should understand how decisions will be made so they can seek redress or accommodations. This includes information before someone takes the assessment as well as feedback on any decisions that are made about them.
- Explain to applicants in simple language how the tool may interpret traits related to disability, so they can understand when they may need to request an accommodation from the test. Develop alternative testing methods for people with disabilities and explain these options to applicants.
- Provide reasonable accommodations for different types of disabilities. Explain to applicants the steps they should take to request these accommodations.
- If the tool’s results suggest that an applicant’s disability prevents them from doing the job, be prepared to explain to the applicant how the tool produced those results, and invite the applicant to explain whether they could do the job with reasonable accommodations.
Auditing: Employers and vendors must thoroughly and regularly audit hiring tools, both before and after they are put to use. This means examining the tools for errors and risk of bias in how they are trained, designed, and implemented, and using those findings to reduce these flaws. Best auditing practices include using an independent third-party auditor and publicly disclosing methods and results. Plan to identify and retain the data necessary to complete audits, such as training data, designs, applicant information, assessment criteria and outputs, and ultimate hiring decisions.
- Do not rely on statistical audits alone: consider the tool’s different effects on different kinds of disabilities, and evaluate how it could misrepresent, or may already have misrepresented, applicants’ skills. Because people’s disabilities are so diverse, statistical auditing is unlikely to reveal how a hiring tool will affect every person based on their particular disability; qualitative analysis is required as well.
- Work with experts in algorithms and employment discrimination who have disabilities and who know how to ensure the tool’s accuracy in assessing job-related traits.
- Have a procedure for applicants to anonymously provide feedback on the hiring process. Regularly record and review the feedback and publicize areas of concern that have been expressed. Use all retained data to inform next steps to correct or compensate for the tool’s effects on people with disabilities.
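One common quantitative component of the audits described above is the EEOC’s four-fifths rule of thumb: comparing each group’s selection rate to that of the highest-selected group, and flagging ratios below 0.8 for closer review. The sketch below uses made-up numbers and hypothetical group labels purely for illustration; as noted above, this kind of statistical check cannot substitute for qualitative analysis, especially since many applicants never disclose a disability and disabilities are too diverse to form a single statistical group.

```python
def selection_rate(selected, applicants):
    """Fraction of applicants from a group who were selected."""
    return selected / applicants if applicants else 0.0

def adverse_impact_ratio(group_rate, reference_rate):
    """Ratio of a group's selection rate to the highest group's rate.
    Under the EEOC four-fifths rule of thumb, a ratio below 0.8 is
    treated as evidence of adverse impact warranting closer review."""
    return group_rate / reference_rate if reference_rate else 0.0

# Hypothetical audit data: (selected, total applicants) per group.
groups = {
    "disclosed_disability": (12, 80),      # selection rate 0.15
    "no_disclosed_disability": (90, 400),  # selection rate 0.225
}
rates = {g: selection_rate(s, n) for g, (s, n) in groups.items()}
reference = max(rates.values())
ratios = {g: adverse_impact_ratio(r, reference) for g, r in rates.items()}

# Groups falling below the four-fifths threshold are flagged for review.
flagged = [g for g, r in ratios.items() if r < 0.8]
```

Here the disclosed-disability group’s ratio is 0.15 / 0.225 ≈ 0.67, below the 0.8 threshold, so a real audit would investigate further, including the qualitative review the text calls for.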