Employers Beware: Using AI in Hiring?

Legal Implications of Using AI in Hiring & Recruitment (October 14th, 2025)
AI in Hiring Is Exploding
Listen up: regulators, courts, and job applicants are paying increasing attention to employment discrimination involving the use of AI tools in hiring, retention, and termination. The particular focus is on how algorithmic decision-making tools may give rise to disparate impact claims under federal and state anti-discrimination laws. Below is an analysis of relevant authorities that discuss this issue. While the efficiency gains can be a game changer, legal liability is potentially right around the corner if AI is not carefully adopted and implemented.
Bias and Fairness in Algorithmic Screening
The EEOC is charged with enforcing the federal anti-discrimination laws, including Title VII and its disparate-impact provisions. Simply put, the EEOC stands ready to review claims that AI tools used in recruiting, advancement, or termination apply criteria that cause a disparate impact on a protected group (i.e., age, race, sex, national origin, religion, or disability). See DOJ & EEOC Warning: Algorithms, AI, and Disability Discrimination in Hiring (Technical Assistance), https://www.justice.gov/opa/pr/justice-department-and-eeoc-warn-against-disability-discrimination (explaining how algorithmic hiring tools can violate the ADA and signaling enforcement attention); EEOC involvement in Mobley v. Workday (AI hiring software litigation), https://www.reuters.com/legal/transactional/eeoc-says-workday-covered-by-anti-bias-laws-ai-discrimination-case-2024-04-11/ (the EEOC has argued Workday should face claims that its AI hiring software had discriminatory effects in a prominent suit); HireVue / AI interviewing litigation and accessibility claims (2025), https://www.proskauer.com/blog/another-legal-challenge-to-an-ai-interviewing-tool (recent complaints challenge AI interviewing tools for discrimination and failure to provide ADA accommodations).
Here are three recent employment law articles that do an excellent job of identifying the current issues: Henson Adams, Adam Sencenbaugh & Dan Lammie, Symposium: Employment Law, The Advocate (Litigation Section of the State Bar of Texas), Spring 2025; Abbey Bashor, Note and Comment, Algorithmic Union Busting: How the Use of Artificial Intelligence in Hiring Could Lead to Violations of the National Labor Relations Act, 56 U. Tol. L. Rev. 335 (Spring 2025); and Keith E. Sonderling, Bradford J. Kelley & Lance Casimir, The Promise and the Peril: Artificial Intelligence and Employment Discrimination, 77 U. Miami L. Rev. (2022).
Under federal law, artificial intelligence is described as a machine-based system that, for a given set of human-defined objectives, can make predictions, recommendations, or decisions influencing real or virtual environments. 15 U.S.C. § 9401(3). The use of AI is intended to speed up the identification of good candidates and make the interviewing, hiring, and onboarding processes more efficient. The articles mentioned in the previous paragraph analyze the various stages of hiring and how AI implicates legal considerations at each one.
For example, during the initial screening process, many recruiters use AI to help them understand the nuances of the job description they received. A recruiter may be screening for a particular technology, say JavaScript, but want to better understand that technology to gain confidence in candidate discussions. This educational use, standing alone, is likely insufficient to draw any scrutiny, unless the recruiter pastes the AI output into the job description. Why is that a potential problem? Because the algorithm may contain biases against a particular protected group, which in turn creates a disparate-impact problem. This risk exists at every phase of the hiring process. Accordingly, the use of AI in generating job descriptions, posting to platforms (think LinkedIn, Indeed, and ZipRecruiter), reviewing applications, and creating interview questions must be evaluated consistently to ensure there is no bias. Even if the bias is unintentional, the employer is still on the hook; ignorance is not a defense (at least that's what I always told my clients when they said they didn't know what the law said). Amazon learned this in 2018, when a self-audit revealed that one of its machine-learning recruiting tools favored male candidates, and the company scrapped the tool.
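What does a disparate-impact self-audit actually look like in practice? One common rule of thumb is the EEOC's "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures (29 C.F.R. § 1607.4(D)): a selection rate for any protected group that is less than 80% of the highest group's rate is generally regarded as evidence of adverse impact. The sketch below, using invented applicant numbers purely for illustration, shows how an employer might run that check on an AI screener's pass-through rates (it is a starting point, not a substitute for a full statistical or legal analysis):

```python
# Hypothetical self-audit sketch of the EEOC's "four-fifths rule"
# (29 C.F.R. § 1607.4(D)). All figures below are invented for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who advanced past the screen."""
    return selected / applicants

def four_fifths_check(rates: dict[str, float]) -> dict[str, bool]:
    """For each group, True if its rate is at least 80% of the top rate."""
    top = max(rates.values())
    return {group: (rate / top) >= 0.8 for group, rate in rates.items()}

# Suppose an AI screener advanced 48 of 120 applicants from group A
# but only 12 of 60 from group B (hypothetical numbers).
rates = {
    "group_A": selection_rate(48, 120),  # 0.40
    "group_B": selection_rate(12, 60),   # 0.20
}
print(four_fifths_check(rates))  # group_B fails: 0.20 / 0.40 = 0.5 < 0.8
```

A failed check does not by itself establish liability, and a passing check does not immunize the employer, but running this kind of comparison regularly, and documenting it, is exactly the sort of ongoing self-audit the agencies encourage.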
So, the question becomes: what can we do to make sure we are doing everything possible to comply with the law? In September 2024, the Department of Labor ("DOL"), in collaboration with the Partnership on Employment & Accessible Technology (PEAT), published an AI & Inclusive Hiring Framework for employers. It identifies ten areas to review when considering adopting AI hiring software and tools. Though primarily addressed to hiring people with disabilities, the framework is instructive for all uses of AI in hiring.
The framework outlines ten key focus areas to assist employers in implementing AI tools responsibly:
1. Identify Employment and Accessibility Legal Requirements
2. Establish Roles, Responsibilities, and Training
3. Inventory and Classify the Technology
4. Work with Responsible AI Vendors
5. Assess Possible Positive and Negative Impacts
6. Provide Accommodations
7. Use Explainable AI and Provide Notices
8. Ensure Effective Human Oversight
9. Manage Incidents and Appeals
10. Monitor Regularly
While the framework is voluntary and not legally binding, it serves as a valuable resource for organizations aiming to enhance their AI-driven recruitment strategies.
There is no one-size-fits-all solution. Every company, per the EEOC, should continually conduct self-audits to ensure its practices do not violate the law. If you're a small employer, it's hard to imagine any state or federal agency having the resources to pursue unintentional violations based on a couple of hires. The agencies' focus on this topic is something to be aware of, but hard-core enforcement against small employers is unlikely in practice. I would say the same about large employers, except that they also need to be concerned about class actions. Good luck. Andrew, General Counsel and CEO, Advantage Tech.
*This content is for informational purposes only and is not legal advice.*