TikTok’s AI Content Moderation Puts Hundreds of UK Jobs at Risk

TikTok plans to cut hundreds of UK trust and safety jobs as AI takes on more content moderation tasks. Some roles will shift to Europe, but UK hiring continues.

Categorized in: AI News, Operations
Published on: Aug 23, 2025

Hundreds of Jobs at Risk in TikTok’s UK Operation

TikTok is set to reduce hundreds of jobs in its UK trust and safety teams as it expands the use of artificial intelligence (AI) in content moderation. This change is part of a broader restructuring affecting roles in the UK, South Asia, and South East Asia. Some tasks will be moved to other European sites, though TikTok has not specified which locations will take on this work.

Despite these cuts, TikTok’s Irish operations are expected to remain unaffected. The company currently employs around 2,500 people in the UK, and many trust and safety roles, along with other operations jobs, will continue there.

What TikTok Says About the Restructuring

A company spokesperson explained, “We are continuing a reorganisation started last year to strengthen our global operating model for trust and safety. This includes consolidating operations into fewer locations to improve efficiency and speed, leveraging technological advancements.”

Although some roles are being cut, TikTok is still hiring in the UK and expects to end the year with more staff than it began with. Employees affected by the changes will be able to apply for internal roles, with suitable candidates given priority.

How AI Is Changing Content Moderation

TikTok’s human moderators are trained to detect accounts that might be used by children and can suspend these accounts. AI systems help by analyzing keywords and reports from users to flag potentially underage accounts.
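To illustrate the idea, the keyword-and-report approach described above can be sketched as a simple rule-based scorer. This is a hypothetical illustration only: the keyword list, weights, and threshold are invented for the example, and TikTok's actual detection systems are not public.

```python
# Hypothetical sketch of keyword- and report-based flagging of potentially
# underage accounts. All names, keywords, and weights are illustrative
# assumptions, not TikTok's real system.

UNDERAGE_KEYWORDS = {"my school", "year 7", "13th birthday"}  # illustrative only

def flag_for_review(bio: str, user_reports: int,
                    keyword_weight: float = 1.0,
                    report_weight: float = 0.5,
                    threshold: float = 2.0) -> bool:
    """Return True if an account should be queued for human review."""
    text = bio.lower()
    # Count how many indicator phrases appear in the profile text.
    keyword_hits = sum(1 for kw in UNDERAGE_KEYWORDS if kw in text)
    # Combine keyword evidence with the number of user reports.
    score = keyword_weight * keyword_hits + report_weight * user_reports
    return score >= threshold

# Example: two keyword hits push the score past the threshold.
print(flag_for_review("Year 7 student, my school is great!", user_reports=0))
```

In practice a system like this would only queue accounts for human moderators, who make the final suspension decision, matching the human-in-the-loop workflow the article describes.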

Automated technology also reduces moderators’ exposure to harmful and distressing content: TikTok reports that the volume of such content viewed by human teams fell by 60% over the past year, and more than 85% of content removed for violating community guidelines is now detected by automated systems.

Regulatory Context in the UK

TikTok’s UK operations must comply with the Online Safety Act, which came into force last month. This law requires platforms to protect users from illegal material, including child sexual abuse and extreme pornography. It also mandates preventing children from accessing harmful or age-inappropriate content.

Key Takeaways

  • Job cuts are concentrated in trust and safety teams
  • AI is taking on a bigger role in content moderation
  • Some UK jobs are being cut; others remain, and new roles are opening
  • Compliance with the UK’s Online Safety Act is mandatory

For operations professionals interested in AI-driven content moderation and in managing workforce change, building skills in AI tools and automation is increasingly valuable. Explore relevant courses at Complete AI Training to stay ahead in this evolving environment.

