EEOC-Initiated Litigation - 2026 Edition

©2026 Seyfarth Shaw LLP

Even prior to Executive Order 14281, the EEOC had not initiated litigation of its own to challenge an employer's use of artificial intelligence in the workplace. However, the EEOC did step into the fray by filing an amicus brief in support of claims by a private plaintiff in Mobley v. Workday.73 In Mobley, the plaintiff alleges that Workday engaged in a pattern or practice of discriminatory job screening that disproportionately disqualifies African-Americans, individuals over 40, and individuals with disabilities from securing employment, in violation of Title VII, Section 1981, the ADEA, the ADA, and California state law. Specifically, the plaintiff alleges that Workday's AI and algorithms are more likely to deny job applicants who are African-American, over 40, or have a disability, and asserts that Workday acted as an employment agency or, in the alternative, as an indirect employer or agent of the employer. Mobley seeks class certification. The district court granted Workday's motion to dismiss Mobley's original complaint but granted the plaintiff leave to amend. Workday again moved to dismiss. On this second round of briefing, the EEOC submitted an amicus brief taking a novel position in support of Mobley's class-action theory that an AI vendor could be directly liable under Title VII, the ADA, or the ADEA for employment discrimination caused by the use of the vendor's AI. Among other things, the EEOC argued that by actively making automated decisions to reject or advance candidates before referring them to employers, Workday functions as an employment agency under the law.
In its amicus submission, the EEOC drew an analogy to IRS rules stating that tax preparation software can be considered a tax preparer if the software does more than provide "mere mechanical assistance." So too, according to the EEOC, with algorithmic tools that go beyond that threshold in the employment context. The district court subsequently issued a split decision that allowed the plaintiff's agency theory, as supported by the EEOC in its amicus brief, to proceed.74 The Court's opinion emphasized the importance of the "agency" theory in addressing potential enforcement gaps in anti-discrimination law. In this regard, the Court illustrated the potential gaps with a hypothetical scenario: a software vendor intentionally creates a tool that automatically screens out applicants from historically Black colleges and universities, unbeknownst to the employers using the software. Without the agency theory, the Court opined, no party could be held liable for this intentional discrimination. By construing federal anti-discrimination laws broadly and adapting traditional legal concepts to the evolving relationship between AI service providers and employers, the Court's decision was based, in part, on the desire to avoid potential loopholes in liability. The ruling opens the door to a significant expansion of liability for AI vendors in the hiring process, with potentially far-reaching implications for both AI service providers and the employers that use their tools. The district court recently granted the plaintiff's request to conditionally certify a collective action based on age. In its decision to conditionally certify the collective, the court gave more weight to the plaintiff's claims about what Workday's products do than to the evidence Workday presented to establish what its products do, in fact.
The plaintiff claims that Workday products make "recommendations" about whether or not a candidate should be hired. For the court, that was enough to overcome Workday's explanation that its customers are responsible for setting hiring criteria and making decisions about their applicants. The court includes a brief discussion of the specific AI tools at issue, including Candidate Skills Match ("CSM"), Hired Score, Fetch, and Spotlight; for the majority of the opinion, however, the court refers generally to Workday's "AI recommendation system," without defining the term or discussing the

73 See Rachel See, Annette Tyman, and Samantha Brooks, EEOC Argues Vendors Using Artificial Intelligence Tools Are Subject to Title VII, the ADA and ADEA Under Novel Theories in Workday Litigation, Workplace Class Action Blog (Apr. 29, 2024), https://www.seyfarth.com/news-insights/legal-update-eeoc-argues-vendors-using-artificial-intelligence-tools-are-subject-to-title-vii-the-ada-and-adea-under-novel-theories-in-workday-litigation.html.

74 See Rachel See and Annette Tyman, Mobley v. Workday: Court Holds AI Service Providers Could Be Directly Liable for Employment Discrimination Under "Agent" Theory, Workplace Class Action Blog (Jul. 19, 2024), https://www.seyfarth.com/news-insights/mobley-v-workday-court-holds-ai-service-providers-could-be-directly-liable-for-employment-discrimination-under-agent-theory.html.

RkJQdWJsaXNoZXIy OTkwMTQ4