Employers increasingly use AI to decide who gets laid off, survey finds

Over half of HR directors now use AI to recommend layoffs, a survey of 1,000 hiring managers found. In April alone, AI-driven cuts accounted for 26% of all announced layoffs.

Published on: May 15, 2026

Employers Turn to AI for Layoff Decisions, Raising Fairness Concerns

More than half of HR directors use artificial intelligence to analyze workforce data and recommend which employees to cut during restructuring, according to a survey of 1,000 hiring managers. The shift marks a significant expansion beyond AI's traditional role in screening resumes.

Of those surveyed, 52% of HR professionals said they use AI to generate productivity data for "workforce planning decisions, including restructuring and role evaluation." Another 28% said they were considering the same approach. Only 20% said they had no plans to use AI for layoff decisions.

The findings come as companies cite AI as their top reason for downsizing. In April alone, AI accounted for 21,490 job cuts announced by employers, or 26% of all announced layoffs that month, according to the consulting firm Challenger, Gray & Christmas. Through April, AI has driven roughly 16% of job cut plans announced this year, up from 13% through March.

The Scale of Job Losses

Technology and information services sectors have absorbed the heaviest losses. In April, the information services sector lost 13,000 jobs while most other industries added positions. Overall, employers added 115,000 jobs that month, but the unemployment rate remained flat at 4.3%.

"Regardless of whether individual jobs are being replaced by AI, the money for those roles is," said Andy Challenger, chief revenue officer at Challenger, Gray & Christmas.

How AI Screening Works, and What It Misses

Nearly three-quarters of hiring directors (73%) use AI to manage application volume. The technology filters candidates automatically: 65% of HR professionals surveyed said their AI systems rejected applicants before any human reviewed them, and 47% acknowledged their AI may have filtered out candidates they would have wanted to consider.

These automated rejections happen early in the hiring process, before hiring managers have a chance to assess factors like potential, growth trajectory, or cultural fit. "When teams lean too heavily on AI, decisions can become overly data-driven, missing important context," said Jasmine Escalera, a career expert at MyPerfectResume.

Bias and Legal Risk

HR departments face real liability when AI systems discriminate. Amazon shut down its AI recruiting assistant in 2018 after discovering it had developed a bias against resumes from women. Similar problems persist across the industry.

Employment attorneys warn that many HR managers cannot identify when job applications have been written with AI tools like JobCopilot or LazyApply. This creates risk: if an AI hiring tool produces a discriminatory impact based on race, sex, age, disability, or religion, employers remain legally responsible.

"Many employers now receive hundreds or thousands of applications for a single role, so automated screening tools have become almost unavoidable," said Haley Harrigan, employment law chair at Gallagher & Kennedy in Phoenix. "If an AI hiring tool produces a discriminatory impact, the employer is still responsible and could face discrimination exposure under federal and state employment laws."
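One common way regulators and attorneys screen for the "discriminatory impact" Harrigan describes is the EEOC's four-fifths guideline: if one group's selection rate falls below 80% of the highest group's rate, the tool warrants scrutiny. The sketch below illustrates that arithmetic; the applicant numbers are hypothetical and not from the survey, and a ratio below 0.8 is a red flag, not a legal conclusion.

```python
def selection_rate(selected, applicants):
    """Share of a group's applicants who pass the screen."""
    return selected / applicants

def adverse_impact_ratio(rate_group, rate_reference):
    """Ratio of a group's selection rate to the highest group's rate.
    Under the EEOC four-fifths guideline, a value below 0.8 is a
    common trigger for further review."""
    return rate_group / rate_reference

# Hypothetical results from an automated resume filter
rate_a = selection_rate(selected=120, applicants=400)  # 0.30
rate_b = selection_rate(selected=60, applicants=300)   # 0.20

ratio = adverse_impact_ratio(rate_b, rate_a)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.67, below the 0.8 mark
print("Flag for review" if ratio < 0.8 else "Within guideline")
```

In this example the filter passes 30% of one group but only 20% of another, giving a ratio of roughly 0.67, which is exactly the kind of pattern an employer would be expected to catch before the tool makes decisions at scale.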

Worker Concerns Run High

Among workers surveyed by Resume Now, 60% said AI would eliminate more jobs than it creates by year's end, 51% worried about losing their jobs to AI, and 46% said a bot could replace them by 2030.

Only 1 in 10 workers trusted AI to make fair hiring decisions, according to a separate AI Fatigue Survey, and 36% of employees feared being replaced by technology.

The Case for Human Judgment

Just over half of hiring managers (51%) expressed confidence that "AI is used fairly in layoffs," while 23% expressed doubts. The remaining 26% said they didn't use AI in layoff decisions at all.

Some HR leaders question whether AI can evaluate what matters most. "AI can't decide what is a good culture fit for your company, and evaluate someone's work ethic," said Andrew Crapuchettes, CEO of RedBalloon, a jobs board. "The adoption of AI in HR today feels like a fad-driven bubble that is fit to burst in a bad way."

Experts recommend transparency. Employers should clearly communicate their AI policies to workers and applicants. They should also maintain human review at critical decision points, especially before rejecting candidates or recommending layoffs.

Carolyn Illman, a Seattle-based hiring coach, said the ability to show vulnerability and build human connections will become more valuable as companies expand their use of bots. "AI isn't able to take the place of a human," she said.

For HR professionals, the challenge is clear: use AI to handle volume and identify patterns, but preserve human judgment where it matters most. Legal and HR departments should collaborate on policies before implementing AI for workforce decisions. The cost of getting it wrong is too high, in legal liability, employee morale, and missed talent.
