TikTok fails disinformation test ahead of EU elections, study reveals
A study has revealed TikTok's failure to block disinformation ads related to the upcoming EU elections.
A recent study by Global Witness has revealed alarming deficiencies in TikTok’s ability to manage disinformation related to the upcoming EU elections. The investigation tested the platform’s content moderation by submitting 16 disinformation ads. TikTok approved all of these ads, which included false information about polling station closures, incorrect voting methods, and incitements to violence.
For the study, Global Witness created 16 disinformation ads relating to the upcoming European parliamentary elections in Ireland and submitted them to X, YouTube, and TikTok, allowing at least 48 hours for the review process. Global Witness also said that it is submitting a complaint to Irish regulators regarding TikTok's failure to mitigate electoral threats. Examples of the fake ads include:
- Incorrect voting method: ‘Don’t vote in person this EU election! New reports find that ballots are being altered by election workers. Vote instead by texting 05505’
- Incorrect voting requirements: ‘New rules change ID requirement for 2024 elections. If you don’t have a valid driving licence, you can’t vote this election day’
- Polling station closure: ‘Following an unprecedented spike in contagious diseases, the Electoral Commission is to close all polling stations to protect staff and public health. Please vote online instead.’
While YouTube rejected most of the disinformation ads (14 out of 16) and X blocked all of them and suspended the researchers’ ability to run ads, TikTok’s approval rate was a concerning 100%. This highlights a significant vulnerability in TikTok’s moderation process, especially given its large and youthful user base.
Why does it matter?
TikTok’s failure to effectively moderate election-related content violates both its own policies, which ‘do not allow misinformation or content about civic and electoral process that may result in voter interference, disrupt the peaceful transfer of power, or lead to off-platform violence’, and the EU’s Digital Services Act, which requires very large online platforms (VLOPs) to mitigate electoral risks by ensuring that they ‘are able to react rapidly to manipulation of their service aimed at undermining the electoral process and attempts to use disinformation and information manipulation to suppress voters.’
A similar study on TikTok led by the EU DisinfoLab further emphasises the issue, highlighting concerns around algorithmic amplification, user demographics, and policy enforcement. TikTok’s recommendation algorithm often promotes sensational and misleading content, accelerating the spread of disinformation, and with a predominantly young user base, the platform can influence a critical segment of the electorate. Despite having policies against political ads and disinformation, its enforcement is inconsistent and often ineffective.
In TikTok’s response to the study, the platform acknowledged a violation of its policy, attributing the approvals to ‘human error’, and said it has launched an internal investigation and implemented new processes to prevent this from happening in the future.