Does AI Help Overcome Bias and Promote Diversity?

Advancements in AI have reduced the recruitment workload, but — as with any new technology — the changes bring their own set of challenges to the table. AI learns from human successes and from human errors, and bots have been shown to replicate our unconscious biases, which creates problems for diverse hiring. So how can recruiters take advantage of the benefits of AI without falling prey to its pitfalls?

Hung Lee, CEO of WorkShape.io and curator of the newsletter Recruiting Brainfood, says we need to expect a learning curve. “We’re on a journey with AI, and we don’t necessarily know where it’s headed,” Lee says. “We’re seeing a gap between intent and outcome, and to eliminate that gap we need to take a cue from AI — we need to learn from the experiment.”

Here’s what employers need to know about using AI to identify talent, no matter who candidates are or where they come from.

Does AI Eliminate Bias?

The question of whether AI will help eliminate bias from hiring is a hot topic in recruiting. Some experts argue that it can reduce or eliminate the selection bias that leads hiring managers to favor candidates most “like them” or to make decisions based on subjective perceptions of “fit,” and instead lets them decide based on data and abilities.

AI’s effectiveness comes down to programming, so it’s possible to place more emphasis on qualities than on background. “It’s been shown that in countries where English is the native language, job candidates with non-native-sounding names consistently encounter higher rejection rates, even though their resumes may be exactly the same as those of applicants with native-sounding names,” Lee says. AI can help eliminate these implicit human biases by filtering for the best candidates based on job-relevant qualities.
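
To make “filtering on relevant qualities” concrete, here is a minimal sketch of what a blind screening step might look like, assuming a simple candidate record with hypothetical fields such as skills and years_experience. Identifying details are stripped before any scoring happens, so name-based signals never enter the ranking.

```python
# Illustrative sketch only: a simplified "blind screening" step that removes
# identifying fields before a candidate is scored, so downstream ranking sees
# skills and experience rather than names. Field names and weights are hypothetical.

IDENTIFYING_FIELDS = {"name", "email", "photo_url", "home_address"}

def blind_candidate(candidate: dict) -> dict:
    """Return a copy of the candidate record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in IDENTIFYING_FIELDS}

def score_candidate(candidate: dict, required_skills: set) -> float:
    """Score purely on job-relevant qualities: skill overlap and years of experience."""
    skills = set(candidate.get("skills", []))
    skill_match = len(skills & required_skills) / max(len(required_skills), 1)
    experience = min(candidate.get("years_experience", 0) / 10, 1.0)  # cap at 10 years
    return 0.7 * skill_match + 0.3 * experience

applicants = [
    {"name": "Aisha Okonkwo", "email": "a@example.com",
     "skills": ["python", "sql", "etl"], "years_experience": 6},
    {"name": "John Smith", "email": "j@example.com",
     "skills": ["python", "excel"], "years_experience": 4},
]

required = {"python", "sql", "etl"}
ranked = sorted((blind_candidate(a) for a in applicants),
                key=lambda c: score_candidate(c, required), reverse=True)
for c in ranked:
    print(c, round(score_candidate(c, required), 2))
```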

Will AI Lead to More Diversity?

On the other hand, some experts say AI’s potential to be unbiased and lead to more diverse hiring may be overblown. Amazon infamously scrapped its AI recruiting tool after learning that it did more to replicate bias than to eliminate it. So how can recruiters recognize when AI is working counter to their intentions?

“AI programmers design their algorithms to seek out the traits of successful people, but — problematically — the people in positions of success have often benefited from selection bias themselves,” says Steven Huang, head of diversity and inclusion at Culture Amp. “So AI recruiting can sometimes fall into the same trap of identifying top talent as this elite, homogeneous group.”

It’s important to understand that there’s no objective measure when it comes to sourcing top talent. The best employees could come from a community college or an Ivy League school, so it’s crucial to be aware of which biases AI is replicating.
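
Huang’s point can be illustrated with a toy simulation. The sketch below uses entirely synthetic, made-up data to train a model that imitates past hiring decisions which favored an “elite school” signal over actual skill; the model ends up reproducing that preference rather than correcting it.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative sketch with synthetic data: if training labels come from past
# hiring decisions that favored candidates from "elite" schools, a model trained
# to imitate those decisions learns the same preference, regardless of skill.
rng = np.random.default_rng(0)
n = 5000
skill = rng.normal(size=n)                   # job-relevant ability
elite_school = rng.binomial(1, 0.2, size=n)  # 20% attended an "elite" school

# Historical hiring leaned heavily on school prestige, only weakly on skill.
hired = (0.3 * skill + 2.0 * elite_school + rng.normal(size=n)) > 1.0

X = np.column_stack([skill, elite_school])
model = LogisticRegression().fit(X, hired)

print("learned weight on skill:       ", round(model.coef_[0][0], 2))
print("learned weight on elite school:", round(model.coef_[0][1], 2))
# The model reproduces the historical preference for elite schools,
# which is the failure mode Huang describes.
```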

Can AI Improve Recruiting?

One thing that most experts agree on is that AI offers employers the ability to conduct more far-reaching recruitment through predictive analytics and automation. This can mean drawing from a larger and more diverse pool of applicants than ever before.

AI can process complex data in a way that isn’t possible for human recruiters. For example, a strong predictor of future success is not someone’s academic grades in absolute terms, but their academic grades relative to their immediate peer group. Lee says AI can mine this data to de-bias a talent search to include schools in lower socioeconomic areas rather than sourcing talent exclusively from top-performing schools. Algorithms like these can bring a much more diverse array of candidates to the forefront.
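
As a rough illustration of what “grades against their immediate peer group” could mean in practice, the sketch below (using hypothetical schools and GPAs) standardizes each candidate’s GPA within their own school before ranking, so a standout from a less prestigious school isn’t automatically outranked by middling grades from an elite one.

```python
from statistics import mean, stdev

# Illustrative sketch: comparing grades against a candidate's immediate peer
# group rather than in absolute terms. Schools, names, and grades are hypothetical.
candidates = [
    {"name": "A", "school": "Regional Community College", "gpa": 3.9},
    {"name": "B", "school": "Regional Community College", "gpa": 3.1},
    {"name": "C", "school": "Regional Community College", "gpa": 3.0},
    {"name": "D", "school": "Elite University",           "gpa": 3.6},
    {"name": "E", "school": "Elite University",           "gpa": 3.7},
    {"name": "F", "school": "Elite University",           "gpa": 3.8},
]

# Group GPAs by school so each candidate is compared to their own cohort.
by_school = {}
for c in candidates:
    by_school.setdefault(c["school"], []).append(c["gpa"])

# A within-cohort z-score: how far above or below their peers each candidate sits.
for c in candidates:
    peers = by_school[c["school"]]
    spread = stdev(peers) or 1.0
    c["peer_relative"] = (c["gpa"] - mean(peers)) / spread

# Ranking on the peer-relative score surfaces the standout community-college
# candidate ahead of mid-pack candidates from the elite school.
for c in sorted(candidates, key=lambda c: c["peer_relative"], reverse=True):
    print(f'{c["name"]:>2}  {c["school"]:<28} gpa={c["gpa"]:.1f}  '
          f'peer_relative={c["peer_relative"]:+.2f}')
```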

Looking for other ways to combat bias in your organization? Check out Culture Amp's free Diversity and Inclusion Starter Kit.
