PEAT Talks to the Experts
We spoke with experts from four companies (“Technology Innovators”) that focus on inclusive hiring to help put your organization on the fast track to using AI equitably. Each company is based in the United States and is led by people with disabilities. These innovators shared insights and best practices learned over years of working with AI hiring tools and job candidates from underrepresented groups. We highlight five essential tips from these conversations.
“We provide explainability information for each recommendation to make sure customers understand our AI matching results.”
Transparency is key when your company procures any technology, but it is crucial for AI hiring tools. Your vendor needs to be able to accurately and meaningfully explain how the tool makes automated decisions. In addition, the vendor must be able to share information about the data the tool was trained on, its intended uses and limitations, how you can fine-tune it, and more. Relying on outdated data sets that cannot be refined can quickly lead to biased hiring practices.
Bottom Line: Request an Explainable AI statement from vendors before purchasing a technology.
“We also know that recruitment needs to take a human-first approach in order to work better for all people, so we needed to go beyond skills and experience.”
The innovators we spoke to all echoed the same sentiment: always include humans in the process! While AI-enabled tools are designed to simplify recruiting and hiring, they are not meant to replace humans. Many companies incorrectly assume the tools they use for hiring and recruiting are trained on inclusive data sets and do not require human intervention. Instead, it takes a combined approach: employers must use inclusive data sets refined by people to account for the diverse characteristics of actual job candidates.
Bottom Line: AI technology is here to help recruiters and hiring managers rather than to replace them.
“…turn the AI bias [that is typically] against people with disabilities on its head to be for people with disabilities.”
You should notify candidates when AI is used, describe how the system works, and tell them about known limitations. You should also offer candidates the chance to request reasonable accommodations, and let them know who has access to their data if they choose to disclose their disability status voluntarily. A hiring system should be built to prioritize data privacy. Organizations should work with their data analysis teams or outside vendors to ensure there is always a lawful basis for processing data. Any personal data collected or processed should be job-related, so that employers see accurate, up-to-date information about candidates, and candidates are matched to jobs and opportunities optimized for their skills and experience.
The OECD AI Policy Observatory (OECD.AI) Principle on transparency and explainability (Principle 1.3) states:
“AI Actors should commit to transparency and responsible disclosure regarding AI systems. To this end, they should provide meaningful information, appropriate to the context, and consistent with the state of art:
- to foster a general understanding of AI systems;
- to make stakeholders aware of their interactions with AI systems, including in the workplace;
- to enable those affected by an AI system to understand the outcome; and
- to enable those adversely affected by an AI system to challenge its outcome based on plain and easy-to-understand information on the factors and the logic that served as the basis for the prediction, recommendation, or decision.”
Bottom Line: Let candidates know how you use their disability disclosure data and use it wisely.
“Our business is run and managed by people with disabilities, and the coding of our platform is built with disability in mind.”
Every technology in your recruiting and hiring process should be accessible. During our discussions, each participant mentioned that AI-enabled tools often contain accessibility barriers in their user interfaces. Companies should address these barriers and also explore ways to use AI to enhance inclusion. One good example is a voice-activated AI chatbot offered by one of these companies. A chatbot, an online feature designed to simulate conversation with humans, is especially helpful for candidates with mobility disabilities because it allows them to apply for jobs using their voices. Candidates say their employment history aloud, and the chatbot then produces a list of suggested positions matching their skills.
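To make the chatbot's matching step concrete, here is a minimal sketch of how spoken employment history, once converted to text and reduced to skills, might be compared against open positions. The job titles, skill sets, and overlap rule below are illustrative assumptions for demonstration, not any vendor's actual system.

```python
# Hypothetical open positions and the skills each one requires.
OPEN_POSITIONS = {
    "Data Analyst": {"sql", "excel", "reporting"},
    "Customer Support": {"communication", "crm", "scheduling"},
    "Web Developer": {"html", "css", "javascript"},
}

def suggest_positions(candidate_skills: set[str], min_overlap: int = 2) -> list[str]:
    """Rank open positions by how many required skills the candidate mentioned,
    keeping only positions that share at least `min_overlap` skills."""
    scored = [
        (len(candidate_skills & required), title)
        for title, required in OPEN_POSITIONS.items()
    ]
    return [title for overlap, title in sorted(scored, reverse=True)
            if overlap >= min_overlap]

# Skills a speech-to-text step might have extracted from a spoken answer:
spoken = {"sql", "reporting", "communication"}
print(suggest_positions(spoken))  # ['Data Analyst']
```

A production system would use far richer matching (skill taxonomies, embeddings, experience levels), but the flow is the same: voice in, skills extracted, positions ranked and suggested.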
Bottom Line: Use AI to create an inclusive experience for job candidates at every stage of recruitment.
“The goal [of] our platform is to provide a level playing field to eliminate implicit biases in the recruiting process. Protected personal characteristics such as gender, age or ethnicity are never used as [qualification factors] in our models.”
The innovators we spoke to strongly emphasized the importance of addressing systemic barriers within the recruitment and hiring processes. They expressed a common desire to level the playing field and eliminate bias. In fact, none of these companies use protected personal characteristics such as gender, age, or ethnicity as qualification factors in their models. With finely tuned algorithms, these personal attributes are hidden or anonymized in a candidate’s profile, which reduces the chance that unconscious bias will impact hiring decisions. In addition, innovators may offer ways to scrub protected personally identifiable information from a profile. This scrubbing can help ensure that implicit or explicit biases don’t creep into the hiring process, which is especially important for multiply marginalized candidates.
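The idea of scrubbing protected attributes before screening can be sketched in a few lines. This is an illustrative simplification, not any vendor's implementation; the field names are assumptions, and real systems must also handle protected information that leaks into free-text fields.

```python
# Illustrative set of protected attributes to withhold from screening.
PROTECTED_FIELDS = {"gender", "age", "ethnicity", "disability_status"}

def scrub_profile(profile: dict) -> dict:
    """Return a copy of the profile with protected attributes removed,
    so only job-related fields (skills, experience) reach the screening step."""
    return {k: v for k, v in profile.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "A. Candidate",
    "skills": ["Python", "data analysis"],
    "years_experience": 5,
    "gender": "female",
    "age": 42,
}

screened = scrub_profile(candidate)
# screened now contains only name, skills, and years_experience.
```

Note that dropping explicit fields is only a first step: protected characteristics can still be inferred from proxies (names, dates, zip codes), which is why the innovators pair scrubbing with carefully tuned models.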
Bottom Line: Consider how hiring tools can hide protected personal attributes that may create bias in screening and hiring decisions.
Reference to the four companies and their insights is for informational purposes only to demonstrate best practices. Such reference should not be interpreted as official endorsement of those entities, their ideas, products, or services by the U.S. Department of Labor.