TikTok moderators have accused the social media company of “oppressive and intimidating” union busting after it fired hundreds of workers in the UK, a process that began just before a vote on forming a union.
Moderators wanted to set up a collective bargaining unit to protect themselves from the personal costs of screening extreme and violent content, and have claimed that TikTok is guilty of unfair dismissal and breaching trade union laws.
Nearly 400 moderators were fired in London before Christmas, in a process launched a week before the vote took place.
TikTok, which has about 30 million monthly users in the UK, has strongly rejected a legal claim filed in an employment tribunal on behalf of three former employees, calling it “baseless”.
It said the dismissals involving roles in the UK and South and South-East Asia were part of a global restructuring amid the growing use of AI to automatically remove posts that breach content rules, with 91% of infringing content now automatically removed.
But John Chadfield, national officer for tech workers at the Communication Workers Union, which represents about 250 of the affected moderators, said of the legal action: “This is about holding TikTok responsible for union busting.”
He added: “Content moderators have the most dangerous job on the internet. They’re exposed to pedophilia material, executions, war, and drug use. Their job is to make sure this content doesn’t reach TikTok’s 30 million monthly users. It’s high pressure and underpaid. They wanted input into their workflow and more say about how they kept the platform safe. They said they were being asked to do too much with too few resources.”
A TikTok spokesperson said: “These changes were part of a broader global restructuring, as we evolve our global operating model to build trust and safety while taking advantage of technological advancements to continue to maximize protection for our users.”
The dispute began in August 2025, when the union was preparing a recognition vote among several hundred moderators and quality assurance agents on TikTok’s trust and safety team, whose job was to check posts, including distressing material processed at high speed, for compliance with TikTok’s rules. According to the legal claim, TikTok then announced a restructuring exercise that put members of the proposed bargaining unit at risk of redundancy.
Rosa Curling, co-executive director of the tech justice nonprofit Foxglove, which is supporting the action, called TikTok’s treatment of its content moderators “appalling.”
“By laying off essential safety workers they are putting the platform’s users at risk,” she said, adding: “TikTok has made its position clear: union busting and trampling on our labor laws comes first – the safety of its users, including millions of children, and the well-being of its essential safety workers comes last. We hope the employment tribunal will force them to change course.”
According to TikTok, the increased use of AI has reduced moderators’ exposure to graphic content by 76% in the past year.
Michael Newman, partner at law firm Leigh Day, said: “This case is an important example of how individuals banding together can confront the power of big tech companies, and especially how the fig leaf of AI cost savings should not be allowed to obscure important security concerns.”
