AI-Based Recruiting Tools May Not Eliminate Bias
More and more employers are using artificial intelligence (AI) as a tool for recruiting and hiring new talent. AI recruiting tools are designed to support HR teams throughout the hiring process, from placing ads on job boards and screening potential candidates to conducting video interviews that measure a candidate’s strengths based on factors such as facial expression, speech patterns and vocal tone.
Companies typically rely on automated decision systems to simplify the process of analyzing a large number of job applicants, as well as to find workers with a niche skill set or level of experience. For example, employers looking to fill technology roles such as software engineer, machine learning engineer and data scientist may use automated decision systems to find and connect with potential candidates.
While utilizing AI in pre-employment assessments and interviews can help employers streamline their recruiting processes, there are risks involved. Research has shown that the use of AI by employers can introduce bias and perpetuate disparities in hiring. One particular area of concern is workers with disabilities, who may be at a greater disadvantage than other candidates based on the way AI tools evaluate them. For example, tools that measure speech patterns in a video interview may screen out candidates with a speech impediment, while tools that track keyboard inputs may eliminate candidates with arthritis or other conditions that limit dexterity.
In addition to concerns about their reliability, AI systems that measure personality can be integrated into hiring without the knowledge of recruiters or applicants. One recent study by New York University found that while many Fortune 500 companies use AI-based solutions to sift through the millions of job applications they receive each year, these companies often decline to identify the technology they are using – despite evidence that some automated decision-making systems make arbitrary decisions.
If human resource teams and job seekers are not aware of how AI technology is affecting the evaluation process, the algorithm could be amplifying bias and discriminatory practices without raising any red flags. This has prompted questions about the fairness, quality and accuracy of companies’ hiring decisions, and the degree to which companies should be held accountable.
Local, state, federal and international officials are evaluating how to regulate these algorithms. In 2022, documents issued by the U.S. Equal Employment Opportunity Commission and the U.S. Department of Justice cautioned that employers’ “blind reliance” on AI decision-making tools – particularly when used to hire, monitor performance or establish other terms and conditions of employment – may discriminate against people with disabilities in violation of the Americans with Disabilities Act.
A New York City law that went into effect in January 2023 is aimed at curbing hiring bias that can occur when businesses use AI tools to screen job candidates. Employers in the city are banned from using automated employment decision tools to screen job candidates unless the technology has been subject to a bias audit conducted no more than one year before putting the tool to use. While other legislation has addressed specific aspects of the hiring process, such as the Illinois law that has regulated the use of AI analysis of video interviews since 2020, the New York City law is the first in the U.S. to apply to the process as a whole.
Experts, regulators and human resource professionals agree that AI is changing the way many companies hire. However, a Pew Research Center survey found that most American job seekers are not keen on being evaluated by a computer. About two-thirds (66%) of respondents said they would not want to apply for a job with an employer that uses AI to help make hiring decisions, citing concerns that such systems might ignore the “human side” of evaluating applicants.
While human beings remain integral to the recruiting process, the willingness by more companies to expand AI’s role is fueling the debate about how far its influence in hiring will go. As with many other HR practices and policies, employers need to carefully manage the implementation of these recruiting and hiring systems to avoid ethical and legal concerns. If you have questions or concerns regarding any issues related to employment discrimination, please contact us at 973.707.3322 or LFarber@LFarberLaw.com.
The contents of this writing are intended for general information purposes only and should not be construed as legal advice or opinion in any specific facts or circumstances.