AI can help diversity recruiting, but ask these questions first.

DATE: 4 October 2023

One trend gaining momentum in 2023 is the use of artificial intelligence (AI) in diversity recruiting. A growing number of AI-based tools are now available that can, in theory, remove unconscious bias and streamline the recruiting process, ultimately helping companies increase diversity and/or identify why candidates may be leaving their talent pipeline. 

I get it. I am continuously talking with recruiting software suppliers to learn more about the newest product or service offering in the diversity recruitment space. So many new solutions pop up daily that it is enough to make your head spin. Before you commit to any of them, though, ask these questions first. 

Question #1: How does your workplace intend to use AI?

AI is often used as a replacement for the inner work that the people doing the hiring need to do. You can remove names and create blind resumes all you want, but eventually the hiring team will still have to face its own biases at other milestones in the interview process. Do not use AI as a replacement tool. AI is one part of a multi-faceted solution. 

Another component is inclusive hiring training, which still needs to happen for every single person on the front lines if you want to effectively increase diversity. This includes, but is not limited to, your recruiters, hiring managers, interview teams, and workplace ambassadors. 

Question #2: Who programs and creates the AI? What are they doing to mitigate bias? 

Despite the good intentions behind it, AI can be biased too. Think about it: who programs and creates the AI software? Humans. As a result, bias can be baked into the design. It is therefore crucial to ask exactly what the creators of the software are doing to mitigate their own bias.

Is the organization you are buying from doing its own internal work to reduce bias? If not, that will be reflected in its software design, increasing the probability that bias is built in. That said, this does not excuse your team or your organization from holding yourselves accountable and doing the work mentioned above. Part of that work is regularly assessing the outcomes. 

Question #3: How will you audit for impact? 

Although AI can be a supportive tool, you still need to monitor your ATS data to find out who is getting ahead in your hiring process, who is getting left behind, and why. Do not underestimate the WHY. Consistently auditing for impact is how you identify and mitigate bias, and how you maintain checks and balances to ensure that your AI is working as effectively as you had hoped. 

Look at your data and see whether the software is delivering what it is supposed to. Is it helpful? Is it adding value? Or does your tool have side effects you were not aware of? Based on the answers to those questions, you can better eliminate bias and build a more inclusive and equitable hiring process within your organization. 
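
To make that audit concrete, here is a minimal sketch of a stage-by-stage pass-through check, written in Python with pandas. It assumes a hypothetical anonymized ATS export with one row per candidate, the furthest stage each candidate reached, and a self-reported demographic group; the column names, sample data, and the four-fifths (80%) flagging threshold are illustrative assumptions, not features of any particular tool.

```python
import pandas as pd

# Hypothetical anonymized ATS export: one row per candidate, recording the
# furthest stage reached and a self-reported demographic group.
candidates = pd.DataFrame({
    "group": ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"],
    "furthest_stage": ["applied", "screen", "onsite", "offer",
                       "applied", "applied", "screen", "screen", "onsite", "offer"],
})

# Ordered hiring funnel; a candidate counts at every stage up to and
# including the furthest one they reached.
STAGES = ["applied", "screen", "onsite", "offer"]
stage_rank = {stage: i for i, stage in enumerate(STAGES)}
candidates["rank"] = candidates["furthest_stage"].map(stage_rank)

for i in range(1, len(STAGES)):
    prev_stage, next_stage = STAGES[i - 1], STAGES[i]
    reached_prev = candidates[candidates["rank"] >= i - 1]
    # Pass-through rate per group: the share of candidates who reached the
    # previous stage and then advanced to the next one.
    rates = (reached_prev["rank"] >= i).groupby(reached_prev["group"]).mean()
    best = rates.max()
    print(f"{prev_stage} -> {next_stage}")
    for group, rate in rates.items():
        # Four-fifths rule of thumb: flag any group advancing at less than
        # 80% of the best-performing group's rate for a closer look.
        flag = "  <- review" if best > 0 and rate < 0.8 * best else ""
        print(f"  {group}: {rate:.0%}{flag}")
```

However you implement it, the point is the same: compare outcomes across groups at every stage of the funnel, not just at the end, and dig into the why behind any gap you find.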

It is important to note here that I am not saying do not use AI. What I highly recommend is that you are thoughtful when deciding which software to use or which company to partner with. Just as you are doing the work and holding yourself accountable, asking these tough questions enables you to hold them, the creators of the artificial intelligence software, accountable as well. Only then will we be able to see true change within the hiring system as a whole.

Download our worksheet here to help you assess the risks and advantages of integrating AI tools into your hiring process to build a more diverse workplace.