The doctrine
AI-powered resume screeners, video-interview analysis, and performance-monitoring systems are now subject to enforcement under Title VII, the ADEA, the ADA, and a growing body of state law.
The EEOC's 2023 technical guidance applies disparate-impact doctrine to algorithmic hiring tools. Several putative class actions have followed. Mobley v. Workday (N.D. Cal.) survived a motion to dismiss in 2024, with the court holding Workday could be an "agent" of employers under Title VII for purposes of its AI-screening services — a doctrinal expansion with implications for every HR-tech vendor.
State law adds another layer. New York City Local Law 144 requires bias audits of automated employment decision tools and candidate notification. Illinois's AI Video Interview Act imposes consent and disclosure obligations. California's emerging FEHA regulations on automated decisionmaking will reach broader categories of employer use.
Plaintiffs face significant proof challenges: disparate impact from a black-box model is difficult to demonstrate without access to the vendor's internal data. Courts are beginning to allow targeted discovery into model architectures and training data, which is reshaping how vendors document their systems.
Leading cases
- Mobley v. Workday (N.D. Cal.): putative class action against Workday's AI screening; vendor-agency theory survived a motion to dismiss.
- EEOC v. iTutorGroup (E.D.N.Y.): first EEOC AI-hiring settlement, focused on age-based filtering of applicants.
- Administrative charges pressing right-to-an-explanation and bias-audit theories.
Key holdings
- Vendors may be employer "agents." Mobley extends Title VII's reach to AI HR-tech providers acting on an employer's behalf.
- Bias audits are mandatory in NYC. Local Law 144 requires audits and candidate disclosures.
- Discovery into models is now plausible. Courts permit targeted discovery into training data and model architectures in employment cases.