The TUC has called on TikTok to “come clean” over plans to axe more than 400 UK jobs and replace them with AI-driven content moderation and lower-paid overseas workers, as MPs launch an inquiry into the impact of the proposed cuts.
The Science, Innovation and Technology Committee, chaired by Chi Onwurah MP, has written to TikTok demanding answers about the redundancies within its London-based Trust and Safety Team – the group responsible for protecting the platform’s 30 million UK users from harmful content such as deepfakes, toxicity, and abuse.
The committee has asked whether TikTok has carried out a risk assessment on user safety, giving the company until 10th November to respond.
Unions and online safety campaigners first raised the alarm earlier this month, warning that the job losses could leave millions of users – including around one million children under 13 – exposed to greater online harms.
The redundancies were also announced just days before staff were due to vote on union recognition with the United Tech and Allied Workers branch of the Communication Workers Union (CWU).
In an open letter to MPs, the TUC said there is “no business case” for the cuts given TikTok’s strong financial performance and accused the company of “union-busting” at the expense of both workers’ rights and user safety.
The letter also expressed concern that human moderators were being replaced by “unproven AI” systems and overseas contractors “subject to gruelling conditions, poverty pay and precarity.”
TUC General Secretary Paul Nowak said: “It is time for TikTok to come clean. Time and time again they have been asked about the impact of these cuts on the safety of millions of Brits, and time and time again they have failed to provide a good enough answer.
“The Science, Innovation and Technology Committee has acted swiftly with its probe, now TikTok must explain how they are going to keep users safe if these cuts to safety-critical jobs go ahead.”
A spokesperson from TikTok said: “As we laid out in our letter to the committee, we strongly reject these claims.
“This reorganisation of our global operating model for Trust and Safety will ensure we maximize effectiveness and speed in our moderation processes as we evolve this critical safety function for the company with the benefit of technological advancements.”
Jessica O'Connor