AI Hiring Discrimination: How Algorithms Reject Qualified Candidates—and Your Right to Fight Back
Workday's AI tools rejected 1.1 billion applications, and a class action now covers millions of applicants. The EEOC rolled back its guidance, but the underlying laws still apply. A complete guide to fighting AI hiring bias.
By Compens.ai Editorial Team
Updated: December 2025
1.1 Billion Rejections: The Algorithm That Says No
When Derek Mobley applied for jobs, he was rejected over and over—more than 80 times by companies using the same hiring software. Mobley is African American, over 40, and disabled. He believes he was never given a fair chance.
The software? Workday's automated resume screening system. And he wasn't alone. According to court filings, Workday's tools rejected 1.1 billion job applications during the relevant period. That's not a typo: over a billion applications filtered out by an algorithm before a human ever saw a resume.
The Mobley v. Workday lawsuit, now proceeding as a class action potentially covering hundreds of millions of job applicants, represents the first major legal challenge to AI hiring tools. And it raises a fundamental question: When you apply for a job in 2025, are you being judged fairly—or filtered out by biased code?
AI Hiring by the Numbers
| Statistic | Figure |
|-----------|--------|
| Applications rejected by Workday tools | 1.1 billion+ |
| Potential class action members | Hundreds of millions |
| Companies using AI in hiring | 99% of Fortune 500 |
| Resumes filtered before human review | 75%+ typically |
| Callback gap for Black applicants | 36% fewer callbacks |
| States with AI hiring laws or pending bills | 25+ |
---
Understanding AI Hiring Systems
How AI Screening Works
Most large employers now use AI-powered tools to screen job applicants:
Resume Parsers: Extract and categorize information from resumes—name, education, experience, skills.
Keyword Matching: Score candidates based on how well their resumes match job descriptions.
Predictive Analytics: Use machine learning to predict "success" based on patterns from previous hires.
Video Analysis: Analyze facial expressions, tone of voice, and word choice in video interviews.
Gamified Assessments: Evaluate cognitive abilities and personality through games and tests.
Chat Interviews: AI-powered chatbots conduct initial screening conversations.
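To see why resumes that mirror a job description's wording tend to survive automated screens, here is a minimal, hypothetical sketch of keyword matching. This is not Workday's or any vendor's actual algorithm (those are proprietary); the function and keywords are invented for illustration.

```python
# Hypothetical keyword-matching sketch -- NOT any vendor's real algorithm.
# It scores a resume by the fraction of required keywords it contains,
# which is why exact wording matters more than a human reader would expect.

def keyword_score(resume_text: str, job_keywords: list[str]) -> float:
    """Return the fraction of job keywords found as whole words in the resume."""
    words = set(resume_text.lower().split())
    hits = sum(1 for kw in job_keywords if kw.lower() in words)
    return hits / len(job_keywords)

job_keywords = ["python", "sql", "leadership", "agile"]
resume = "Senior analyst with Python and SQL experience leading agile teams"

score = keyword_score(resume, job_keywords)
print(round(score, 2))  # "leading" does not match "leadership" -> 0.75
```

Note that a clearly qualified candidate who wrote "leading teams" instead of "leadership" loses a quarter of the score: the filter rewards phrasing, not ability.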
Where Bias Enters
AI systems can discriminate in multiple ways:
Training Data Bias
If a company's existing workforce is predominantly white, male, or young, the AI learns that "successful" candidates look like existing employees. The algorithm doesn't consciously discriminate—it just replicates historical bias at scale.
Example: Amazon scrapped an AI recruiting tool in 2018 after discovering it penalized resumes containing the word "women's" and downgraded graduates of all-women's colleges.
Proxy Discrimination
Even if AI doesn't directly consider protected characteristics, it can rely on proxies:
- Zip code correlates with race
- Graduation year reveals age
- Employment gaps may indicate disability or caregiving
- Name patterns suggest ethnicity or gender
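The proxy effect can be shown with a toy example. Everything below is synthetic and hypothetical (the zip codes, groups, and selection rule are invented): the screen never sees "group," but because zip code correlates with group, a facially neutral zip-based rule reproduces the disparity anyway.

```python
# Toy, synthetic illustration of proxy discrimination. The screening rule
# below never looks at "group" -- only at zip code -- yet it selects
# entirely from one group, because zip code stands in for group membership.

applicants = [
    {"zip": "60601", "group": "A"},
    {"zip": "60601", "group": "A"},
    {"zip": "60827", "group": "B"},
    {"zip": "60827", "group": "B"},
]

# A rule learned from biased history: past hires happened to come from
# 60601, so "prefer zip 60601" looks neutral but encodes group.
selected = [a for a in applicants if a["zip"] == "60601"]

share_group_a = sum(a["group"] == "A" for a in selected) / len(selected)
print(share_group_a)  # 1.0 -- every selected applicant is from group A
```

This is why simply deleting the protected attribute from the training data does not make a model fair: any correlated feature can smuggle it back in.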
Keyword Bias
Job requirements may inadvertently exclude protected groups:
- "Digital native" excludes older workers
- "Prestigious university" disadvantages first-generation students
- "Cultural fit" can mask bias
Interview AI Bias
Video interview analysis systems have been found to:
- Disadvantage non-native English speakers
- Penalize candidates with speech disabilities
- Score darker-skinned candidates lower
- Prefer facial expressions tied to particular cultural norms
---
The Workday Lawsuit: A Landmark Case
What Happened
Plaintiff: Derek Mobley, an African American man over 40 with disabilities
Defendant: Workday, Inc., a major HR software company
Filed: February 2023, U.S. District Court, Northern District of California
Allegations:
- Workday's AI screening tools discriminate based on race, age, and disability
- Mobley applied to 80+ jobs using Workday's system and was rejected every time
- The algorithm "learns" to discriminate from biased hiring patterns
- Workday is liable as an "employment agency" under federal anti-discrimination law
2025 Developments
May 2025: The court grants conditional certification of the ADEA (Age Discrimination in Employment Act) claims as a collective action, the first collective action to challenge AI screening software.
July 2025: Judge Rita Lin expands the scope to include applications processed using HiredScore AI features (Workday acquired HiredScore, an AI recruiting company).
August 2025: Workday ordered to provide list of all customers who enabled HiredScore AI features.
Why This Case Matters
Scale: With 1.1 billion rejected applications, this could be the largest employment discrimination case in history.
Precedent: First case to hold an AI vendor (not just an employer) liable for hiring discrimination.
Discovery: Court process will reveal how Workday's algorithms actually work—information normally kept secret.
Regulatory impact: Regardless of outcome, the case is spurring legislative action nationwide.
Workday's Defense
Workday denies discrimination, calling the ruling a "preliminary, procedural" decision "that relies on allegations, not evidence." The company maintains its tools are designed to reduce bias, not perpetuate it.
---
Legal Framework: Your Rights Against AI Discrimination
Federal Protections
Title VII of the Civil Rights Act (1964)
Prohibits employment discrimination based on race, color, religion, sex, or national origin. Applies to AI hiring when:
- The AI has a disparate impact (a discriminatory effect even without intent)
- The employer uses AI knowing it discriminates
- The AI serves as a proxy for protected characteristics
Age Discrimination in Employment Act (ADEA)
Protects workers 40 and older. AI systems may violate the ADEA if they filter by:
- Graduation year
- Years of experience (with caps)
- "Digital native" requirements
- Cultural references to recent events
Americans with Disabilities Act (ADA)
Requires reasonable accommodations and prohibits discrimination based on disability. AI concerns include:
- Video interviews penalizing speech or mobility impairments
- Gamified assessments inaccessible to candidates with certain disabilities
- Employment gaps due to disability counting against candidates
- Personality assessments screening out neurodiverse applicants
Genetic Information Nondiscrimination Act (GINA)
Prohibits use of genetic information in employment. Some AI health screening tools may implicate GINA.
The Disparate Impact Problem
Under federal law, employment practices that have discriminatory effects can be illegal even without discriminatory intent. This "disparate impact" theory is crucial for AI cases because:
- AI doesn't have conscious intent
- But AI can have discriminatory effects at massive scale
- Those effects may be provable through statistical analysis
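The statistical analysis usually starts with the EEOC's "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures: if one group's selection rate is less than 80% of the most-favored group's rate, that is generally treated as evidence of adverse impact. A minimal sketch, using made-up numbers:

```python
# Four-fifths (80%) rule sketch, per the EEOC Uniform Guidelines on
# Employee Selection Procedures. The applicant counts are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of a group's applicants who advanced past the screen."""
    return selected / applicants

rate_group_a = selection_rate(60, 100)   # 60% of group A advanced
rate_group_b = selection_rate(30, 100)   # 30% of group B advanced

impact_ratio = rate_group_b / rate_group_a
print(round(impact_ratio, 2))  # 0.5 -- below 0.8, evidence of adverse impact
```

In litigation this is a starting point, not the whole case: courts also look at statistical significance and whether the employer can show the practice is job-related and consistent with business necessity.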
However: In April 2025, President Trump signed an Executive Order directing federal agencies to eliminate enforcement based on disparate impact theory. This affects EEOC and DOJ enforcement but does NOT affect private lawsuits; individuals can still sue.
The EEOC Guidance Rollback
What happened: On January 27, 2025, the EEOC removed AI-related guidance from its website. This guidance, published in May 2023, explained how existing anti-discrimination laws apply to AI hiring tools.
What it means:
- Federal enforcement priorities have shifted away from AI bias
- The underlying laws (Title VII, the ADA, the ADEA) still apply
- Private litigation remains fully available
- State enforcement is increasing
---
State Laws: The New Frontier
States With AI Hiring Laws
Illinois AI Video Interview Act (2020)
First state to regulate AI in hiring:
- Employers must notify applicants when AI analyzes video interviews
- Must explain how the AI works and what characteristics it evaluates
- Applicants must consent before AI analysis
- Employers must destroy videos upon request
Colorado AI Act (2024)
First comprehensive AI bias law:
- Effective February 1, 2026
- Requires impact assessments for "high-risk" AI systems, including hiring
- Mandates risk management practices
- Requires disclosure to consumers
- Enforced by the Colorado attorney general (no private right of action)
New York City Local Law 144 (2023)
Regulates automated employment decision tools:
- Annual bias audits required
- Audit results must be publicly posted
- Candidates must be notified
- Right to request an alternative process
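The bias audits required under Local Law 144 report, among other things, an "impact ratio" for each demographic category: that category's selection rate divided by the rate of the most-selected category. A simplified sketch with invented rates (the law's rules also cover scoring-rate ratios and intersectional categories, omitted here):

```python
# Simplified sketch of the impact-ratio metric reported in NYC Local
# Law 144 bias audits: each category's selection rate divided by the
# rate of the most-selected category. The rates below are hypothetical.

selection_rates = {"Group A": 0.50, "Group B": 0.40, "Group C": 0.20}

best_rate = max(selection_rates.values())
impact_ratios = {
    group: round(rate / best_rate, 2)
    for group, rate in selection_rates.items()
}
print(impact_ratios)  # {'Group A': 1.0, 'Group B': 0.8, 'Group C': 0.4}
```

Because these results must be posted publicly, LL144 audits are one of the few places job seekers can actually see a vendor's measured disparities.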
California (2025)
The Civil Rights Council adopted final regulations on automated decision-making systems, including the kind of AI hiring tools at issue in the Workday case.
States With Pending Legislation
Over 25 states have introduced AI hiring regulation bills in 2025, including:
- Texas
- Washington
- New Jersey
- Massachusetts
- Virginia
- Georgia
- Florida
- And many more
---
How to Protect Yourself
Before Applying
Research the company's hiring process:
- Check Glassdoor and other reviews for mentions of AI screening
- Look for news about the company's hiring practices
- Note if job listings mention automated screening

Optimize your resume for automated screening:
- Use keywords from the job description
- Format your resume clearly (AI struggles with creative layouts)
- Include measurable achievements
- Avoid graphics, tables, or unusual formatting

Document each application:
- Save your resume and cover letter for each application
- Screenshot the job posting
- Note the date and time of application
- Record any confirmation numbers
If You Suspect Discrimination
Signs of potential AI discrimination:
- Rejection within seconds or minutes of applying
- A pattern of rejections at companies using the same software
- Rejection despite clear qualifications for the role
- No explanation provided
- Inability to speak to a human recruiter
Steps to take:
- Request information: Ask the employer about their hiring process and whether AI was used
- Document everything: Keep records of all applications, rejections, and communications
- File an EEOC charge: You must file with the EEOC before suing under federal law
  - Filing deadline: 180 days (300 days in states with fair employment agencies)
  - You can file online at eeoc.gov
- Consider state agencies: Many states have civil rights agencies that handle employment discrimination
- Consult an attorney: Employment discrimination attorneys often work on contingency
For Specific Protected Characteristics
If you're over 40:
- Document job postings that mention "digital native," "recent graduate," or other age-correlated terms
- Note if you're qualified but rejected in favor of younger candidates
- ADEA claims can now proceed as collective actions per the Workday ruling

If you have a disability:
- Request reasonable accommodations for AI assessments
- Ask for alternative screening methods if AI is inaccessible
- Document any failures to accommodate
- The ADA requires employers to provide accessible application processes

If you suspect race or ethnicity discrimination:
- Track your application patterns and outcomes
- Note if companies have poor diversity records
- Statistical disparities can support discrimination claims
---
The Technology Problem
Why AI Bias Is Hard to Fix
Black box algorithms: Many AI systems can't explain their decisions, making bias hard to detect and prove.
Training data reflects history: If past hiring was biased, AI trained on that data perpetuates bias.
Proxy variables: Even "neutral" factors can correlate with protected characteristics.
Lack of auditing: Most AI hiring tools are not independently audited for bias.
Vendor accountability: Employers often don't fully understand the AI tools they purchase, which blurs responsibility when those tools discriminate.
Research on AI Hiring Bias
Studies have consistently found bias in AI hiring systems:
Resume studies: AI systems show significant preference for "white-sounding" names over identical resumes with names associated with other ethnicities.
Age bias: Algorithms trained on "successful" employees at tech companies systematically disadvantage older applicants.
Disability bias: Video analysis AI performs poorly on candidates with speech differences, facial paralysis, or movement disorders.
Gender bias: Some systems penalize career gaps (affecting women with children) or associate certain words with gender.
---
Filing Complaints and Taking Action
Federal Complaints
EEOC (Equal Employment Opportunity Commission):
- File a charge of discrimination at eeoc.gov
- You must file before suing under federal law
- The EEOC may investigate or issue a "right to sue" letter
- Note: Federal enforcement has decreased under the current administration

OFCCP (Office of Federal Contract Compliance Programs):
- For federal contractors (many large employers)
- OFCCP enforces affirmative action requirements
- dol.gov/agencies/ofccp
State and Local Complaints
State civil rights agencies: Most states have agencies handling employment discrimination
Local human rights commissions: Many cities have additional protections
State attorneys general: Some states actively investigating AI hiring bias
Private Legal Action
Class actions: Workday case shows class actions against AI vendors are viable
Individual lawsuits: Can pursue individual claims under federal and state law
Collective actions: ADEA claims can proceed as "opt-in" collective actions
Finding a lawyer:
- National Employment Law Project: nelp.org
- National Employment Lawyers Association: nela.org
- State bar lawyer referral services
---
The Future of AI Hiring
What Needs to Change
Transparency: Job applicants should know when AI is used and how it works
Auditing: Regular, independent audits of AI hiring tools for bias
Accountability: Clear liability when AI discriminates
Human oversight: Meaningful human review of AI recommendations
Right to explanation: Applicants should know why they were rejected
Alternative processes: Options for candidates disadvantaged by AI
What You Can Do
Advocate for change:
- Support legislation regulating AI hiring
- Contact state legislators about pending bills
- Share your experiences with advocacy organizations

Exercise your rights:
- Request information about hiring processes
- File complaints when appropriate
- Consider joining class actions

Stay informed:
- Follow developments in the Workday case
- Monitor state legislation
- Learn how AI is used in hiring
---
Resources
Federal Agencies
- EEOC: eeoc.gov
- Department of Labor: dol.gov
- FTC (for deceptive AI practices): ftc.gov
Advocacy Organizations
- National Employment Law Project: nelp.org
- AI Now Institute: ainowinstitute.org
- Electronic Frontier Foundation: eff.org
- Upturn: upturn.org
Legal Help
- National Employment Lawyers Association: nela.org
- Legal Aid at Work: legalaidatwork.org
- State bar associations: Find through the American Bar Association
Research
- AI fairness research: Partnership on AI (partnershiponai.org)
- Algorithmic bias studies: Various academic institutions
- Industry accountability: Data & Society (datasociety.net)
---
Conclusion: Algorithms Can Be Held Accountable
When Derek Mobley filed his lawsuit against Workday, many predicted it would be dismissed. AI systems were seen as too complex, too opaque, and too new for traditional discrimination law to apply.
They were wrong. The case is proceeding, potentially covering hundreds of millions of job applicants. The algorithms that silently rejected qualified candidates are being exposed to legal scrutiny.
The key lessons:
- AI discrimination is real: 1.1 billion rejected applications isn't a glitch; it's a system working as designed, with discriminatory effects
- The law applies: Title VII, the ADEA, and the ADA cover AI hiring tools, even without new legislation
- You have rights: File EEOC charges, pursue state remedies, join class actions
- States are acting: Even as federal enforcement retreats, state laws are expanding
- Change is possible: Public pressure and litigation are forcing transparency and accountability
The promise of AI in hiring was to reduce human bias—to judge candidates on qualifications alone. Instead, many systems have automated discrimination at unprecedented scale.
But accountability is coming. The algorithms that reject qualified candidates are finally being questioned. And you have the power to demand answers.
Your qualifications matter. Your experience matters. Your rights matter. Don't let an algorithm have the last word.
---
This guide provides general information about employment discrimination and AI hiring practices. It does not constitute legal advice. Employment laws vary by jurisdiction. Consult with an employment attorney for specific situations.
Sources: EEOC, National Employment Law Project, HR Dive, FairNow
Last Updated: December 2025