The AI Hiring Lawsuit: What Mobley v. Workday Means for the Future of AI in Recruiting


We’ve talked at length about the use of AI in hiring, but legal guidance and precedent have been largely lacking—until now. 

On May 16th, a US District Judge ruled that Mobley v. Workday could proceed as a collective action, potentially adding millions of plaintiffs to the case. The suit alleges that Workday’s AI screening tools discriminated against candidates on the basis of age, race, and disability status. 

While the case has not yet been decided, it’s raising important questions for businesses: Can employers be held liable if the software they use is discriminatory? How do we get the tools we need to grow while avoiding risk? Is it safe to use AI at all? 

Let’s talk about it. 

Before we dive in, a disclaimer: Hellohire is a recruiting platform with built-in AI tools. We’re clearly biased, in that we believe AI can and should be used in recruiting; it’s what we help our customers do every day. 

However, this also means we’ve seen it all when it comes to AI in hiring: the good, the bad, the not-actually-even-AI, and the tools that damage your hiring process, leaving you understaffed and vulnerable to allegations of discrimination. Improving trust in AI through expert guidance and transparency is in our best interest, as well as yours.

What is Mobley v. Workday?

In short, this collective action lawsuit alleges that Workday’s applicant screening products are discriminatory. The plaintiffs allege that Workday’s algorithm “disproportionately disqualifies individuals over the age of forty (40) from securing gainful employment.”

Derek Mobley first sued Workday in 2023, alleging that he was rejected from more than 100 jobs he applied to through Workday’s platform because of his age, race, and disabilities. As one example, he cites an application that was rejected in the middle of the night, within an hour of applying. 

Since his original suit, four additional plaintiffs over the age of 40 have joined with allegations of age discrimination in violation of the Age Discrimination in Employment Act (ADEA). One of them, Jill Hughes, states she received similar automated rejection emails outside of business hours, indicating her applications had never been reviewed by a human being. She says several of the emails incorrectly stated that she didn’t meet the minimum requirements for the role.

The plaintiffs are seeking unspecified damages and changes to Workday’s policies.

The State of AI in Hiring

This is far from the first time AI’s effects on the job market have come under fire; we’ve covered how AI is affecting both job applicants and recruiters multiple times in our biweekly newsletter, Hiring at Scale. Here’s what we’ve seen in the last few months:

Candidates are increasingly using AI to mass-apply to jobs, whether or not they’re qualified for the position. As a result, recruiters’ average workload increased to 588 applicants in Q3 of 2024, a 26% increase from just a year prior. That’s only expected to increase moving forward.

At the same time, the AI boom has set off a gold rush of companies scrambling to develop “AI-enabled” hiring tools to cash in. Often, these tools aren’t well designed; sometimes they aren’t AI at all. In fact, one prospective customer admitted to losing 650 of 800 candidates in a single month (a whopping 81%) to an unintuitive “AI” chatbot. 

Many “AI” candidate engagement tools are just rules-based chatbots with limited responses to specific keywords, making the candidate experience more frustrating. If your hiring process runs on the same technology as the least helpful customer service interaction you’ve ever had, you’re going to lose candidates.

The result is a self-defeating cycle where recruiters receive a flood of unqualified applicants and turn to subpar tools to handle an ever-increasing workload. The candidate experience gets worse, leading more applicants to outsource their job search to AI, and the cycle continues.

What Does This Mean for Recruiters? 

Should we just swear off using AI in hiring altogether? 

Not necessarily, but it does mean recruiters need to be aware of the potential for algorithmic bias. According to IBM, “Algorithmic bias occurs when systematic errors in machine learning algorithms produce unfair or discriminatory outcomes. It often reflects or reinforces existing socioeconomic, racial and gender biases.”

Essentially, AI is not impartial. Because it’s created and trained by humans, it can reflect existing biases in hiring, whether or not that’s what the person deploying the tool intends. 

So what should recruiters do to address the risks of algorithmic bias in hiring?

Use AI (and All Tools) Strategically

We at Hellohire clearly believe in the use of AI in recruiting. However, it’s critical to understand why and how you’re implementing AI in the first place, as with any other software you use. Get clear on how new tools will influence hiring decisions to identify areas of risk.

AI can and should be used to automate the tedious tasks that get in the way of the human side of hiring: attracting and engaging high-quality candidates to build trust and long-term business growth.

Provide Clarity on Hiring Decisions

If a candidate is rejected, provide feedback whenever possible. Hellohire customers can use automated questionnaires to confirm that applicants meet the minimum requirements for a role; candidates who don’t are told immediately why they weren’t a fit. 

This added transparency both protects your business and encourages motivated candidates to address any issues (such as a missing certification or an error in their application) and reapply, meaning you don’t lose qualified potential hires due to a typo.

When in Doubt, Ask for Help

This is an evolving field, and regulatory guidance in many cases simply does not exist. As a result, it can feel like any use of AI in recruiting opens your business up to risk. 

When thoughtful tools are used correctly, that’s not the case, and you don’t have to figure it out alone. If you have questions about implementing AI in your own recruiting, we’re here to help. Reach out to be connected with a hiring specialist here. 

Special thanks to Amanda Sternklar, a home care expert who supported the research and writing of this article.