Investigation reveals concerns regarding TikTok’s safeguards for child users
The investigation found evidence suggesting that potentially underage accounts bearing special internal tags received preferential treatment.
TikTok is facing scrutiny over its safeguards for child users following an investigation by The Guardian. The investigation revealed that moderators were instructed to allow users under 13 to remain on the platform if they claimed their parents were managing their accounts, contradicting TikTok’s stated policy that only users aged 13 and above are permitted on the platform. According to the evidence, moderators were advised in meetings to let accounts stay on the platform if a parent was visible in the background of a video or if the account bio indicated parental management.
TikTok denies these allegations, stating that its community guidelines apply equally to all content and that it does not allow children under 13 on its platform. The company claims the allegations are either false or based on misunderstandings and argues that The Guardian has not provided enough information for it to investigate further.
The investigation also suggests that potentially underage accounts received preferential treatment through internal tags, such as the ‘top creator’ label. This raises concerns about the platform’s moderation practices and its enforcement of age restrictions.
Why does it matter?
Regulatory frameworks such as the UK’s Children’s Code and the EU’s Digital Services Act aim to address these concerns, but the effectiveness of their enforcement remains a subject of debate. This is not the first time TikTok has faced criticism over its handling of underage accounts. In September, the Irish data watchdog fined TikTok for violating EU data protection law in relation to children’s accounts, and in April, the UK data regulator fined TikTok for allegedly misusing children’s data without parental consent.
TikTok is regulated in the UK by Ofcom under its video-sharing platform rules, which are being integrated into the Online Safety Act. The act requires platforms to consistently enforce their terms of service, including age restrictions, and demonstrate measures to prevent underage access. In the EU, TikTok is subject to the Digital Services Act, which mandates measures to protect children from harmful content and prohibits using under-18s’ data for targeted advertising.