Written by
Aidan Farrish
Aidan is an aPHR-certified writer on the marketing team at BerniePortal. She writes about HR, healthcare, and benefits.
Can You Use AI During Recruitment?
Updated May 19, 2023: The EEOC has released a technical assistance document on assessing the use of AI during recruitment in an effort to prevent discrimination.
Artificial intelligence (AI) is a big topic on everyone’s mind this year, and you’re likely thinking about it, too. The potential uses of AI in the workplace seem endless, from ChatGPT writing your copy to AI tools screening resumes.
While AI has the potential to streamline much of HR’s hiring process, it isn’t quite ready yet. Training AIs takes time, and for now, the promise of recruitment AIs outstrips their current usability.
Read on to discover more about the legal drawbacks of using current AIs during the hiring process, plus relevant criticism from legal experts.
Should AI Be Used During Recruitment?
You may have seen some organizations using AI to screen resumes, applicants, and more. For roles that receive thousands of resumes for a single opening, a tool that can sift through applications far faster than any human may prove beneficial.
If you have thousands of applications for one role, sure, look into using AI. But small to midsized businesses rarely face that problem, so hiring managers and HR alike are more concerned with the quality of applicants than with sorting massive volumes of candidates.
So should you use AI during your recruitment process? Well, there are some pros and cons, and right now, the cons outweigh the pros.
Pros: AI can streamline the recruitment process by selecting candidates based on data you provide, which can:
- Cut down on recruitment costs
- Save you time and effort
- Ensure you’re seeing the best candidates ASAP
Those are some pretty attractive reasons to use AI for recruitment, starting yesterday. So why wouldn’t you dive right in? Because while the pros seem great, the cons are significant, with one major con looming over the rest: EEOC scrutiny.
Legal Drawbacks of Using AI During Recruitment
If you’re in HR, or have any hand in an organization’s administration and hiring practices, you’re familiar with the Equal Employment Opportunity Commission (EEOC). In short, the EEOC protects employees’ rights to work, preventing discriminatory practices in the workplace and during the recruitment process.
But the EEOC evolves alongside modern employment practices, and technology has thrown it a major-league curveball in the form of screening AIs.
Technological advancements have come at an explosive pace, but not all of them walk hand-in-hand with the EEOC’s mission. Screening AIs have a bad habit of falling into discriminatory practices, and the EEOC has taken note and made plans to step in.
The main issue with screening AIs is that they use the data you provide to shape their algorithms around what they think you want. So if you submit the application requirements and details about previously successful employees in that open role, the AI only has that data to guide its search.
Consider this: if your company has been around for 50 years, you may have had quite a few more men in certain roles than women. So when you supply an AI with that data, it may skew its search toward men, since that is who your organization has hired historically. In fact, Amazon scrapped an AI hiring tool for doing this exact thing.
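To make that concrete, here is a minimal sketch, using entirely made-up data and the open-source scikit-learn library (not any particular vendor’s tool), of how a screening model trained on skewed historical hires can learn gender as a predictor even when skill is what should matter:

```python
# Minimal sketch with HYPOTHETICAL data: a screening model trained on
# historically male-favoring hires learns gender as a predictor.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

gender = rng.integers(0, 2, n)   # 1 = male, 0 = female (hypothetical)
skill = rng.normal(50, 10, n)    # the thing that *should* drive hiring

# Historical outcome: qualified men were always hired, qualified women
# only about 30% of the time.
hired = ((skill > 50) & ((gender == 1) | (rng.random(n) < 0.3))).astype(int)

# Train the screening model on that history.
model = LogisticRegression(max_iter=1000).fit(
    np.column_stack([gender, skill]), hired
)

# Two equally skilled candidates get very different scores.
print("P(hire | man, skill 55):  ", round(model.predict_proba([[1, 55]])[0, 1], 2))
print("P(hire | woman, skill 55):", round(model.predict_proba([[0, 55]])[0, 1], 2))
```

The model was never told to prefer men; it simply reproduced the pattern baked into the history it was trained on.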
But the funny thing about history is that it’s in the past, and the EEOC wants to advance employment equality into the future. The EEOC rightfully has no patience for AIs that limit their searches based on historical data and thereby undermine equal opportunity, but the technology isn’t yet able to reliably avoid unintended discriminatory practices.
You may be thinking, “Well, that’s an easy fix, I just won’t include that kind of info so the AI doesn’t ignore the applications of quality candidates.” The thing is, AI is very good at finding patterns. It may latch onto seemingly neutral details and use them as stand-ins for the information you left out, quietly shaping which candidates it supplies you.
What Do Legal Experts Think of AI in the Workplace?
When you provide data to an AI, you don’t intend for it to be biased, but an AI may bias itself by using your historical hiring patterns. Pauline Kim, a law professor at Washington University in St. Louis, explains how this can impact equal opportunity in your recruitment process.
In her article “Artificial Intelligence and the Challenges of Workplace Discrimination and Privacy,” she covers how workplace AI can provide benefits but falls short because of the limits of AI’s predictive analysis. It isn’t just that your organization hired mostly men back in the 1950s; AI tools latch onto patterns that, when used in your hiring process, reveal deeper problems with introducing AI to the workplace.
If you supply data to an AI and many of the employees in that data played a particular sport, the AI may cut worthy applicants who played a different sport. You can guess how that could affect your hiring process, but it reveals a deeper and more problematic habit: AI fixates on irrelevant details and turns them into biased assumptions.
For example, what if your AI tool pre-selects applicants based on their zip code? If your AI tool of choice only selects applicants from affluent or racially homogeneous areas, you’re losing out on a major segment of the hiring pool.
That treads a very thin line into discrimination, from classism to racism, if hiring managers aren’t watching closely enough to catch mistakes. And even if you didn’t instruct an AI to discriminate against certain applicants, the law won’t discriminate when levying penalties against you.
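Zip code is a classic proxy variable. The hypothetical sketch below (again made-up data and scikit-learn, not any real tool) never shows the model an applicant’s group membership, yet its selection rates still split along group lines because zip code stands in for that information:

```python
# HYPOTHETICAL data: the protected attribute is withheld from the model,
# but zip code acts as a proxy for it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

group = rng.integers(0, 2, n)    # protected attribute, never shown to the model
zip_a = np.where(rng.random(n) < 0.9, group, 1 - group)  # zip code tracks group 90% of the time
skill = rng.normal(50, 10, n)

# Historical hires favored group 1, independent of skill.
hired = ((skill > 50) & ((group == 1) | (rng.random(n) < 0.3))).astype(int)

# The model sees only zip code and skill.
X = np.column_stack([zip_a, skill])
model = LogisticRegression(max_iter=1000).fit(X, hired)

# Selection rates still differ sharply by the attribute the model never saw.
selected = model.predict_proba(X)[:, 1] > 0.5
for g in (0, 1):
    print(f"selection rate, group {g}: {selected[group == g].mean():.2f}")
```

Dropping the protected column didn’t remove the bias; the model simply found another route to the same pattern.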
Some states and cities have already implemented measures to prevent organizations from using AI unchecked during their hiring process. New York City has restricted the use of AI by instituting two policies: companies must notify applicants of any AI tools used in their hiring process 10 days beforehand, and any AI tools must be audited for bias before use.
Brian Eastwood, a columnist for HR Dive and a specialist in healthcare and technology content, provides more information. For New York employers using AI for all of their open positions, the fines will add up: $500 for the first violation and $1,500 for each subsequent violation. Other jurisdictions, like Illinois, are following suit, so the likelihood that AI hiring tools will be audited and restricted on a larger scale, including by federal agencies, grows by the day.
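For a sense of what a bias audit might actually check, the EEOC’s technical assistance points employers to the long-standing “four-fifths rule” for flagging adverse impact. Here is a minimal sketch of that selection-rate comparison, using hypothetical numbers:

```python
# Four-fifths rule check with HYPOTHETICAL screening numbers: compare the
# lower group's selection rate to the higher group's rate.
def impact_ratio(selected_a, total_a, selected_b, total_b):
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    low, high = sorted([rate_a, rate_b])
    return low / high

# Example: the tool advanced 48 of 80 men (60%) and 12 of 40 women (30%).
ratio = impact_ratio(48, 80, 12, 40)
print(f"impact ratio: {ratio:.2f}")   # 0.30 / 0.60 = 0.50
print("possible adverse impact" if ratio < 0.8 else "within the four-fifths guideline")
```

A ratio below 0.8 doesn’t prove discrimination on its own, but it’s the kind of red flag an auditor, or the EEOC, would expect you to explain.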
AI hiring tools present many possibilities, but the general consensus is that they aren’t ready yet. Don’t open yourself up to potential liability by using underdeveloped tools, but do keep an eye on AI as it advances.
Additional HR Resources
You can stay informed, educated, and up-to-date with important HR topics using BerniePortal’s comprehensive resources:
- BernieU—free online HR courses, approved for SHRM and HRCI recertification credits
- Resource Library—tools, templates, and checklists on an extensive list of HR topics
- BerniePortal Blog—a one-stop shop for HR industry news
- HR Glossary—featuring the most common HR terms, acronyms, and compliance
- HR Party of One—our popular YouTube series and podcast, covering emerging HR trends and enduring HR topics