Why AI-Powered Recruitment Can Be Problematic

With recruiters juggling so much at once, AI-powered recruiting tools that take some of the work off their plate can seem like a great solution. After all, artificial intelligence can identify candidates and review resumes and applications much faster than you can, freeing up your time to nurture candidates so they become more invested in your employer brand. AI can also help you decide which candidates to choose: its predictive capabilities can identify those who are most likely to succeed in their roles and those you're most likely to retain.

While these benefits can undoubtedly make your job easier, AI-powered recruiting tools can also cause problems, especially if you're trying to maintain your DEIB hiring strategies.

3 Ways AI-Powered Recruitment Can Be Problematic

Before relying on AI-powered recruitment tools, keep in mind that they can open the door to several forms of unintended discrimination.

1. Gender Discrimination

You want to attract talent from underrepresented backgrounds, but artificial intelligence can introduce bias into the process that stands in the way of hiring the very talent you want. For one, AI-powered recruiting technology is only as effective as how it's programmed: if it's not set up to eliminate bias, it won't help you reach your DEIB hiring goals. Amazon learned this lesson the hard way in 2018, when the company discontinued its artificial intelligence recruiting tool because the system judged new applications against patterns in candidates' resumes from the previous ten years. Because that candidate pool was dominated by men, the tool ended up discriminating against women applying for software developer positions, favoring male applicants over their female counterparts.

2. Ability Discrimination

In May 2022, the Department of Justice and the Equal Employment Opportunity Commission released guidance warning that AI-powered recruiting tools may lead to discrimination against people with disabilities for multiple reasons. First, as in the Amazon scenario, the agencies explain that if an artificial intelligence platform is programmed to identify applicants who resemble the company's successful hires, it can discriminate because people living with disabilities may not be represented on the current staff.

In addition, when companies use algorithmically powered testing technologies to evaluate job seekers, they may discriminate against people with disabilities by judging speaking, manual, or sensory skills rather than sticking strictly to the skills directly related to the job at hand.

3. Racial and Cultural Discrimination

Candidates from underserved racial and cultural groups are not spared from the potential discrimination of AI-powered recruiting practices, either. According to DiversityQ, when artificial intelligence is programmed to factor in where candidates live, it can use zip codes to screen out applicants from locations populated by racially and ethnically diverse groups if the organization does not already employ people from those areas. This can cause you to miss large pools of qualified talent solely because of the address they list on their resumes.

How to Avoid Potential AI-Powered Recruiting Issues

Since the effectiveness of AI-powered recruiting is directly related to how it's programmed, it's important to ensure that the developers configuring the algorithms aren't embedding their own unconscious biases and causing discrimination that hinders your DEIB hiring. For example, if the program is built to look for words like "ambitious" on resumes, it may discriminate against women as it searches through candidate qualifications. Similarly, a well-meaning recruiter might weight a candidate's address out of concern about whether the candidate can realistically commute to work every day, and inadvertently discriminate against people from underserved communities who live in a specific geographic area.

Another way to keep potential discrimination in check is to periodically audit the AI-powered recruiting program to ensure it's not derailing your DEIB objectives and preventing you from hiring the talent you need. Review the candidates the system has rejected to determine whether they disproportionately come from diverse backgrounds. If the number of these rejections is high, go back to the drawing board to figure out why it's happening and how the system can be adjusted to solve the problem.
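An audit like this can start as a simple calculation. The sketch below, in Python, computes each group's selection rate from screening outcomes and flags groups whose rate falls below 80% of the highest group's rate, following the EEOC's "four-fifths" rule of thumb for adverse impact. The group names and data are hypothetical, and the assumption is that you can export demographic group and screening outcome per candidate from your system; this is an illustrative first pass, not a substitute for a full legal or statistical review.

```python
# A minimal adverse-impact audit sketch. Assumes you can export each
# candidate's demographic group and screening outcome; the groups and
# outcomes below are hypothetical.

def selection_rates(outcomes):
    """outcomes: list of (group, was_advanced) pairs -> selection rate per group."""
    totals, advanced = {}, {}
    for group, was_advanced in outcomes:
        totals[group] = totals.get(group, 0) + 1
        advanced[group] = advanced.get(group, 0) + int(was_advanced)
    return {g: advanced[g] / totals[g] for g in totals}

def four_fifths_flags(rates):
    """Flag groups whose selection rate is below 80% of the highest
    group's rate (the EEOC's four-fifths rule of thumb)."""
    best = max(rates.values())
    return {g for g, r in rates.items() if r < 0.8 * best}

# Hypothetical screening results exported from an AI-powered tool
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
rates = selection_rates(outcomes)
print(rates)                     # selection rate for each group
print(four_fifths_flags(rates))  # groups that may warrant a closer look
```

A flagged group is a signal to investigate, not proof of discrimination; the point is to make disparities visible early enough to adjust the system.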

AI-powered recruiting systems can indeed make your life easier; however, they can also make your DEIB hiring much more difficult if you're not careful. To ensure you're offering opportunities to candidates regardless of their background, take steps that help the artificial intelligence you're using make the best decisions, so it's not keeping the candidates you want from moving through the hiring funnel.
