Numerous studies from around the world have confirmed that discrimination against job candidates is far more than a statistical anomaly.
For example, a Statista survey found that 29% of participants felt discriminated against during their job search because of their physical appearance, and 26% felt the hiring manager judged them based on their employment status.
More recent research confirms that discrimination is still prevalent in hiring. A 2020 survey of 800 Americans found that 54% felt discriminated against while looking for a new job.
Unfortunately, millions of people have faced discrimination during the hiring process, and AI has been presented as the solution to this dreadful issue. Since AI tools don't think and feel the way we do, including them in the hiring process seemed like the right thing to do.
But should we replace recruiters with AI, and what are the ramifications of what may turn out to be a terrible mistake?
Is Using AI in the Hiring Process the New Normal?
Artificial Intelligence (AI) can optimize workflows and help businesses cut production costs. It has many advantages, including the ability to learn from data with minimal human supervision.
The most common examples of AI's potential are manufacturing robots, marketing chatbots, AI writing tools, and smart assistants. With that in mind, it's no surprise that many industries rely heavily on AI to improve workplace productivity and consumer satisfaction.
Surprisingly enough, one of these industries is the recruitment industry. With AI on their side, recruiters can avoid mundane work and improve their chances of finding ideal candidates.
Slowly but surely, AI is redefining recruiters’ roles, allowing them to be more proactive and focused on the end goal. AI hiring tools can reduce sourcing and screening times and scout for candidates’ traits that set them apart from other job seekers.
Is AI Biased?
AI tools aren't perfect, and many AI programs and projects are far from reaching their potential. Worse, AI hiring tools can exhibit biases, which is especially problematic when they are used to decide who gets a job.
For instance, AI hiring software may learn to recognize a white person's face more reliably than a person of color's, which only perpetuates the discrimination and oppression of minority groups.
Amazon's AI hiring tool is another example of bias in artificial intelligence. Its experimental recruiting engine was trained on patterns in resumes submitted to Amazon over a 10-year period. And since most of those applicants were male, the engine couldn't rate candidates in a gender-neutral manner.
AI bias happens because humans choose AI algorithms’ data
AI is just a set of algorithms that maps inputs to outputs based on the data it was trained on. But because science fiction often portrays AI as machines exhibiting human-like behavior, it's easy to think these tools are capable of a human level of cognitive response.
However, AI hiring tools aren't emotional. They don't harbor unconscious biases of their own against marginalized groups. In other words, AI isn't intelligent enough for any emotional response, positive or negative.
So, the problem isn’t AI itself. Data provided to AI algorithms is the real issue.
AI bias happens because humans choose the data AI algorithms learn from, and we can't expect sound output if we supply flawed or incomplete data. An AI hiring tool will become biased if something important is missing from its training, such as data that teaches it to treat gender as irrelevant to the job. The tool can only make decisions based on the information it is given, so skewed input leads to skewed output.
Whether we care to admit it or not, we all carry unconscious biases. And since humans supply the data for AI hiring tools, the AI we train to perform specific tasks inherits those biases.
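The mechanism can be shown in a few lines. The sketch below uses entirely made-up resume data and a deliberately naive keyword-scoring "model" (not any real hiring system): because a gender-correlated token co-occurs with rejection in the skewed history, the trained model penalizes it in new resumes.

```python
# Minimal sketch with hypothetical data: a naive keyword-scoring model
# trained on skewed historical hiring decisions reproduces that skew.
from collections import defaultdict

# Hypothetical past decisions (1 = hired, 0 = rejected). Most past hires
# were men, so the token "womens_club" only co-occurs with rejection.
history = [
    (["python", "leadership"], 1),
    (["python", "teamwork"], 1),
    (["java", "leadership"], 1),
    (["python", "womens_club"], 0),
    (["java", "womens_club"], 0),
    (["java", "teamwork"], 0),
]

def train(examples):
    """Score each token by the average hiring outcome it co-occurred with."""
    totals, counts = defaultdict(float), defaultdict(int)
    for tokens, hired in examples:
        for t in tokens:
            totals[t] += hired
            counts[t] += 1
    return {t: totals[t] / counts[t] for t in totals}

def score(model, tokens):
    """Average the learned token scores for a new resume."""
    return sum(model.get(t, 0.5) for t in tokens) / len(tokens)

model = train(history)

# Two otherwise-identical resumes differ only by a gender-correlated token.
a = score(model, ["python", "leadership"])
b = score(model, ["python", "leadership", "womens_club"])
print(a > b)  # True: the gender-correlated token drags the score down
```

Nothing in the code "dislikes" the token; the penalty is inherited entirely from the historical labels, which is exactly how a tool like Amazon's recruiting engine went wrong.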
One possible solution is to diversify data samples and ensure the programs have enough data to identify every demographic group fairly.
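One simple form of that diversification can be sketched as follows. This is a toy illustration with synthetic group labels, not a production technique: it oversamples the under-represented group so both groups appear equally often in the training set.

```python
# Minimal sketch with hypothetical data: rebalance a skewed training set
# by oversampling the under-represented group before training.
import random

def rebalance(examples, group_of):
    """Oversample minority groups so every group appears equally often."""
    groups = {}
    for ex in examples:
        groups.setdefault(group_of(ex), []).append(ex)
    target = max(len(members) for members in groups.values())
    rng = random.Random(0)  # fixed seed keeps the sketch reproducible
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Draw extra copies at random until this group reaches the target.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

# Hypothetical resumes tagged with a synthetic demographic group:
# eight from group_a, only two from group_b.
data = [("resume_%d" % i, "group_a") for i in range(8)] + \
       [("resume_%d" % i, "group_b") for i in range(2)]

balanced = rebalance(data, group_of=lambda ex: ex[1])
counts = {g: sum(1 for _, grp in balanced if grp == g)
          for g in ("group_a", "group_b")}
print(counts)  # {'group_a': 8, 'group_b': 8}
```

Rebalancing alone doesn't remove bias, since duplicated examples carry the same labels, but it prevents the model from treating the minority group as statistical noise.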
"These tools can't be trained to only identify job-related characteristics and strip out gender and race from the hiring process, because the kinds of attributes we think are essential for being a good employee are inherently bound up with gender and race," Dr. Kerry Mackereth, a post-doctoral researcher at the University of Cambridge's Centre for Gender Studies, said in an interview with BBC News.
Keeping humans at the heart of recruitment is the only viable way to fight AI bias. AI hiring tools can help talent scouts cut their work time in half, but they should be continuously monitored to ensure fairness and inclusion.