Hiring AI: Is It Really as ‘Unbiased’ as We Think?
- jay22324
- Jun 28, 2023
- 2 min read
Updated: Jan 24, 2024
These days a single job posting can attract hundreds or even thousands of applications, so it’s easy to see why around 75% of recruiters use AI during the hiring process, most commonly in the form of applicant tracking systems (ATS). You’ll often hear that these systems can process a resume in milliseconds, something that might take a person seconds or minutes, and thus let HR teams find the right fit more efficiently. ATS have been billed as a way to eliminate bias and cut through the fat: a new era of hiring. However, as AI shows up more and more in the job market, so do its not-so-hidden biases.

It may seem contradictory to say a computer can be biased, but in 2021 the U.S. Equal Employment Opportunity Commission (EEOC) launched an initiative to investigate bias in hiring AI and algorithms. It has since issued guidance warning that ATS can unfairly screen out disabled applicants: the software simply rejects people with disabilities because it doesn’t know how to account for reasonable accommodations.
In another well-known case, a hiring algorithm became so biased that it concluded people named Jared who played lacrosse in high school would have the best job performance. And while that might be true of past hires, using those two qualifications to screen applicants is neither efficient nor effective. Not all good workers are sporty Jareds, after all.
Though these ATS tools seem tempting, every company should recognize the responsibilities that come with them. Even when a third-party AI system screens applicants, EEOC regulations require that the data used to reach a hiring decision be kept in case of a bias claim. So even if a company has no control over why an algorithm chose one candidate over another, that company can still be liable for any discrimination that occurred.

You may be asking how something like this happens, and why it’s so prevalent. The answer lies mostly in the fact that, currently, AI is not capable of understanding context or nuance. It learns what it’s taught, and if its programmers or its training data are biased (as they often unintentionally are), then it processes applicants with that same bias. For instance, Amazon’s experimental recruiting AI preferred male candidates because it was trained on Amazon’s own hiring history, which had been heavily male-dominated. Machine learning is “money laundering for bias,” said Pinboard creator Maciej Cegłowski in 2016. “It’s a clean, mathematical apparatus that gives the status quo the aura of logical inevitability. The numbers don’t lie.”
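To make the mechanism concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn and synthetic data; the feature names and numbers are invented, and this is not any real vendor’s system). A model trained on past hiring decisions that quietly favored men ends up rating an otherwise identical female candidate lower:

```python
# Toy illustration of how biased history becomes a biased model.
# All data here is synthetic and the features are hypothetical.
import random
from sklearn.linear_model import LogisticRegression

random.seed(0)

def make_history(n=1000):
    """Simulate past hiring records in which men got a hidden boost."""
    X, y = [], []
    for _ in range(n):
        is_male = random.random() < 0.7        # historically male-heavy applicant pool
        skill = random.random()                # true qualification, 0..1
        # Past human decisions: mostly skill, plus a quiet advantage for men.
        hired = skill + (0.3 if is_male else 0.0) + random.gauss(0, 0.1) > 0.8
        X.append([int(is_male), skill])
        y.append(int(hired))
    return X, y

X, y = make_history()
model = LogisticRegression().fit(X, y)

# Two candidates identical in every feature except the gender flag.
male_candidate, female_candidate = [[1, 0.75]], [[0, 0.75]]
print("male  :", model.predict_proba(male_candidate)[0][1])
print("female:", model.predict_proba(female_candidate)[0][1])
# The model scores the male candidate higher only because the history did.
```

Nothing in that code ever mentions bias; the model simply reproduces the pattern it was shown, which is exactly Cegłowski’s point.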
Maybe that will improve in the future, but for now, AI is only the sum of its parts. When it is this prone to error, it raises the question of whether we should be using it at all.
Looking for an alternative to job sites that rely on AI for hiring? WorkOnward, launching Dec 9th this year, offers a platform that connects people to people. It puts you before your resume with a personalized profile and skips AI screening entirely. Working around AI to land a job shouldn’t be necessary, and on WorkOnward it isn’t.
Published by WOW (WorkOnward), written by Hannah Drake